[ 553.158875] env[59327]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 553.600742] env[59369]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 555.120169] env[59369]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=59369) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 555.120593] env[59369]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=59369) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 555.120593] env[59369]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=59369) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 555.120866] env[59369]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 555.121947] env[59369]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 555.237330] env[59369]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=59369) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 555.247450] env[59369]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=59369) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 555.347710] env[59369]: INFO nova.virt.driver [None req-5639f8af-308a-4c97-b7b3-55b7bbfb68a5 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 555.420592] env[59369]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 555.420749] env[59369]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 555.420839] env[59369]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=59369) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 558.598076] env[59369]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-a98a24d7-2597-41e1-92e2-1ab104d39f3c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.612673] env[59369]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=59369) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 558.612790] env[59369]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-2062c7e0-1e03-4390-bbd9-4a6290d553d8 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.643852] env[59369]: INFO oslo_vmware.api [-] Successfully established new session; session ID is e41b5.
[ 558.643979] env[59369]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.223s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 558.644608] env[59369]: INFO nova.virt.vmwareapi.driver [None req-5639f8af-308a-4c97-b7b3-55b7bbfb68a5 None None] VMware vCenter version: 7.0.3
[ 558.648064] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaa05a89-5821-4deb-9170-902192e26203 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.666592] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d99543e-3f3a-4f63-bb7f-7eea79322f2e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.672857] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bc35de2-9c49-40ed-a84f-5e4403a615b5 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.679622] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6852fbcb-31c1-4e42-856c-61db76298ac9 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.692378] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1f8ead8-84c0-49c4-a460-40e283787a9e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.698151] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c6360c8-2be9-4e79-a2f5-38903c150fa1 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.727868] env[59369]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-3bd52cfb-ff38-4e2c-9f9e-787b61bb59e4 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.732544] env[59369]: DEBUG nova.virt.vmwareapi.driver [None req-5639f8af-308a-4c97-b7b3-55b7bbfb68a5 None None] Extension org.openstack.compute already exists. {{(pid=59369) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 558.735183] env[59369]: INFO nova.compute.provider_config [None req-5639f8af-308a-4c97-b7b3-55b7bbfb68a5 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
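The lockutils lines above show the pattern oslo.concurrency uses around `VMwareAPISession._create_session`: a named lock serializes session creation, and the wrapper logs how long the caller waited for the lock and how long it held it (3.223s here, dominated by the SOAP login). A minimal stdlib-only sketch of that pattern (not oslo's actual code; the names and the returned session ID are taken from the log for illustration):

```python
import functools
import threading
import time

# Registry of named locks, mirroring lockutils' internal semaphore table.
_LOCKS = {}
_REGISTRY_LOCK = threading.Lock()

def synchronized(name):
    """Decorator sketch: serialize callers on a named lock and report
    waited/held durations like the lockutils DEBUG lines above."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            with _REGISTRY_LOCK:
                lock = _LOCKS.setdefault(name, threading.Lock())
            print(f'Acquiring lock "{name}" by "{fn.__qualname__}"')
            t0 = time.monotonic()
            with lock:
                print(f'Lock "{name}" acquired :: waited {time.monotonic() - t0:.3f}s')
                t1 = time.monotonic()
                try:
                    return fn(*args, **kwargs)
                finally:
                    print(f'Lock "{name}" "released" :: held {time.monotonic() - t1:.3f}s')
        return inner
    return wrap

@synchronized("oslo_vmware_api_lock")
def create_session():
    # Stand-in for the real _create_session: log in to vCenter and
    # return the new session ID (value copied from the log).
    return "e41b5"

print(create_session())
```

The real `VMwareAPISession` holds the lock across the whole SOAP login so concurrent greenthreads reuse one vCenter session instead of each opening their own.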
[ 558.751615] env[59369]: DEBUG nova.context [None req-5639f8af-308a-4c97-b7b3-55b7bbfb68a5 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),f0287f1d-9751-4826-968a-28c4eab6da24(cell1) {{(pid=59369) load_cells /opt/stack/nova/nova/context.py:464}}
[ 558.753613] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 558.753837] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 558.754579] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 558.754922] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Acquiring lock "f0287f1d-9751-4826-968a-28c4eab6da24" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 558.755118] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Lock "f0287f1d-9751-4826-968a-28c4eab6da24" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 558.756075] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Lock "f0287f1d-9751-4826-968a-28c4eab6da24" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 558.768344] env[59369]: DEBUG oslo_db.sqlalchemy.engines [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59369) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 558.768695] env[59369]: DEBUG oslo_db.sqlalchemy.engines [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59369) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 558.775583] env[59369]: ERROR nova.db.main.api [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 558.775583] env[59369]: result = function(*args, **kwargs)
[ 558.775583] env[59369]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 558.775583] env[59369]: return func(*args, **kwargs)
[ 558.775583] env[59369]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 558.775583] env[59369]: result = fn(*args, **kwargs)
[ 558.775583] env[59369]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 558.775583] env[59369]: return f(*args, **kwargs)
[ 558.775583] env[59369]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 558.775583] env[59369]: return db.service_get_minimum_version(context, binaries)
[ 558.775583] env[59369]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 558.775583] env[59369]: _check_db_access()
[ 558.775583] env[59369]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 558.775583] env[59369]: stacktrace = ''.join(traceback.format_stack())
[ 558.775583] env[59369]:
[ 558.776341] env[59369]: ERROR nova.db.main.api [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 558.776341] env[59369]: result = function(*args, **kwargs)
[ 558.776341] env[59369]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 558.776341] env[59369]: return func(*args, **kwargs)
[ 558.776341] env[59369]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 558.776341] env[59369]: result = fn(*args, **kwargs)
[ 558.776341] env[59369]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 558.776341] env[59369]: return f(*args, **kwargs)
[ 558.776341] env[59369]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 558.776341] env[59369]: return db.service_get_minimum_version(context, binaries)
[ 558.776341] env[59369]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 558.776341] env[59369]: _check_db_access()
[ 558.776341] env[59369]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 558.776341] env[59369]: stacktrace = ''.join(traceback.format_stack())
[ 558.776341] env[59369]:
[ 558.776692] env[59369]: WARNING nova.objects.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Failed to get minimum service version for cell f0287f1d-9751-4826-968a-28c4eab6da24
[ 558.776817] env[59369]: WARNING nova.objects.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 558.777251] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Acquiring lock "singleton_lock" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 558.777406] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Acquired lock "singleton_lock" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 558.777642] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Releasing lock "singleton_lock" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 558.777955] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Full set of CONF: {{(pid=59369) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 558.778102] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ******************************************************************************** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 558.778232] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] Configuration options gathered from: {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 558.778361] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 558.778551] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 558.778676] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ================================================================================ {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 558.778874] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] allow_resize_to_same_host = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.779051] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] arq_binding_timeout = 300 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.779182] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] backdoor_port = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.779306] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] backdoor_socket = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.779465] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] block_device_allocate_retries = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.779622] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] block_device_allocate_retries_interval = 3 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.779785] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cert = self.pem {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.779945] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.780119] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute_monitors = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.780285] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] config_dir = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.780448] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] config_drive_format = iso9660 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.780578] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.780736] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] config_source = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.780895] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] console_host = devstack {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
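The "No DB access allowed in nova-compute" errors above come from a guard in nova's DB API layer: nova-compute runs database-less, so any code path that reaches the main DB is intercepted and the calling stack is logged, which is why the traceback ends inside `_check_db_access` at `traceback.format_stack()`. A hedged, stdlib-only sketch of that mechanism (the decorator name, flag, and exception here are invented for illustration; nova's real wrapper lives in nova/db/main/api.py):

```python
import functools
import traceback

# In nova-compute this flag is set at service startup; DB access is
# then disallowed for the lifetime of the process.
DISALLOW_DB_ACCESS = True

class DBNotAllowed(Exception):
    """Raised when a DB API function is called from a database-less service."""

def check_db_access(fn):
    """Hypothetical stand-in for nova's DB-access guard decorator."""
    @functools.wraps(fn)
    def wrapper(context, *args, **kwargs):
        if DISALLOW_DB_ACCESS:
            # Mirrors _check_db_access: capture and log the full call
            # stack so the offending code path can be identified.
            stacktrace = ''.join(traceback.format_stack())
            print(f"No DB access allowed in nova-compute: {stacktrace}")
            raise DBNotAllowed(fn.__name__)
        return fn(context, *args, **kwargs)
    return wrapper

@check_db_access
def service_get_minimum_version(context, binaries):
    # Would normally query the cell database for service versions.
    return {b: 0 for b in binaries}

try:
    service_get_minimum_version(None, ["nova-compute"])
except DBNotAllowed:
    # The caller degrades gracefully, as the WARNING lines above show.
    print("Failed to get minimum service version")
```

This matches the log's shape: an ERROR with the captured stack, followed by a WARNING from the caller that swallowed the exception.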
[ 558.781094] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] control_exchange = nova {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.781273] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cpu_allocation_ratio = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.781432] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] daemon = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.781596] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] debug = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.781746] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] default_access_ip_network_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.781903] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] default_availability_zone = nova {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.782063] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] default_ephemeral_format = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.782293] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.782450] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] default_schedule_zone = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.782602] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] disk_allocation_ratio = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.782757] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] enable_new_services = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.782927] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] enabled_apis = ['osapi_compute'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.783128] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] enabled_ssl_apis = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.783305] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] flat_injected = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.783464] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] force_config_drive = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.783616] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] force_raw_images = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.783780] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] graceful_shutdown_timeout = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.783935] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] heal_instance_info_cache_interval = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.784176] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] host = cpu-1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.784368] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.784530] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] initial_disk_allocation_ratio = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.784686] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] initial_ram_allocation_ratio = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.784898] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
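The CONF dump lists three `--config-file` arguments. oslo.config reads them in command-line order, so a value set in a later file (here `/etc/nova/nova-cpu-1.conf`, which plausibly supplies the per-service `host = cpu-1`) overrides the same option from an earlier file. A minimal illustration of that layering using only the stdlib `configparser` (the file contents below are invented; the real files are not shown in the log):

```python
import configparser

# Stand-in for /etc/nova/nova.conf (shared base settings).
base = """
[DEFAULT]
host = devstack
debug = False
"""

# Stand-in for /etc/nova/nova-cpu-1.conf (per-service overrides).
override = """
[DEFAULT]
host = cpu-1
debug = True
"""

cfg = configparser.ConfigParser()
cfg.read_string(base)      # read first, like the first --config-file
cfg.read_string(override)  # read last: later files win on conflict

print(cfg["DEFAULT"]["host"])   # cpu-1
print(cfg["DEFAULT"]["debug"])  # True
```

The effective values match the dump above (`host = cpu-1`, `debug = True`), which is exactly what `log_opt_values` prints: the merged result, not the individual files.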
[ 558.785069] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] instance_build_timeout = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.785230] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] instance_delete_interval = 300 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.785391] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] instance_format = [instance: %(uuid)s] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.785551] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] instance_name_template = instance-%08x {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.785707] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] instance_usage_audit = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.785868] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] instance_usage_audit_period = month {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.786054] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.786248] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] instances_path = /opt/stack/data/nova/instances {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.786432] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] internal_service_availability_zone = internal {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.786585] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] key = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.786742] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] live_migration_retry_count = 30 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.786898] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] log_config_append = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.787071] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.787231] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] log_dir = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.787384] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] log_file = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.787508] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] log_options = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.787665] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] log_rotate_interval = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.787827] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] log_rotate_interval_type = days {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.787987] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] log_rotation_type = none {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.788126] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.788249] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.788411] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.788570] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.788692] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.788850] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] long_rpc_timeout = 1800 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.789012] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] max_concurrent_builds = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.789172] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] max_concurrent_live_migrations = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.789354] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] max_concurrent_snapshots = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.789524] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] max_local_block_devices = 3 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.789681] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] max_logfile_count = 30 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.789835] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] max_logfile_size_mb = 200 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.789991] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] maximum_instance_delete_attempts = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.790164] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] metadata_listen = 0.0.0.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.790329] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] metadata_listen_port = 8775 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.790492] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] metadata_workers = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.790650] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] migrate_max_retries = -1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.790809] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] mkisofs_cmd = genisoimage {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.791027] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] my_block_storage_ip = 10.180.1.21 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.791178] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] my_ip = 10.180.1.21 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.791352] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] network_allocate_retries = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 558.791527] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] non_inheritable_image_properties
= ['cache_in_nova', 'bittorrent'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.791689] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] osapi_compute_listen = 0.0.0.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.791849] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] osapi_compute_listen_port = 8774 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.792014] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] osapi_compute_unique_server_name_scope = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.792184] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] osapi_compute_workers = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.792343] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] password_length = 12 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.792499] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] periodic_enable = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.792654] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] periodic_fuzzy_delay = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.792814] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] pointer_model = usbtablet {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.792972] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] preallocate_images = none {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.793161] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] publish_errors = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.793293] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] pybasedir = /opt/stack/nova {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.793447] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ram_allocation_ratio = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.793602] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] rate_limit_burst = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.793763] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] rate_limit_except_level = CRITICAL {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.793918] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] rate_limit_interval = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.794084] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] reboot_timeout = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.794264] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] reclaim_instance_interval = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.794423] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] record = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.794588] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] reimage_timeout_per_gb = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.794747] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] report_interval = 120 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.794901] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] rescue_timeout = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.795066] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] reserved_host_cpus = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.795223] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] reserved_host_disk_mb = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.795377] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] reserved_host_memory_mb = 512 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.795534] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] reserved_huge_pages = None {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.795687] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] resize_confirm_window = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.795842] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] resize_fs_using_block_device = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.795994] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] resume_guests_state_on_host_boot = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.796168] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.796326] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] rpc_response_timeout = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.796479] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] run_external_periodic_tasks = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.796637] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] running_deleted_instance_action = reap {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.796788] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] running_deleted_instance_poll_interval = 1800 {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.796941] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] running_deleted_instance_timeout = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.797112] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler_instance_sync_interval = 120 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.797245] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] service_down_time = 300 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.797407] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] servicegroup_driver = db {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.797561] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] shelved_offload_time = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.797711] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] shelved_poll_interval = 3600 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.797871] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] shutdown_timeout = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.798035] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] source_is_ipv6 = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.798193] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ssl_only = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.798471] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.798638] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] sync_power_state_interval = 600 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.798795] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] sync_power_state_pool_size = 1000 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.798955] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] syslog_log_facility = LOG_USER {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.799118] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] tempdir = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.799276] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] timeout_nbd = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.799434] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] transport_url = **** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.799590] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] update_resources_interval = 0 {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.799743] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] use_cow_images = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.799896] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] use_eventlog = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.800055] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] use_journal = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.800212] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] use_json = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.800364] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] use_rootwrap_daemon = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.800515] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] use_stderr = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.800662] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] use_syslog = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.800809] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vcpu_pin_set = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.800967] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
vif_plugging_is_fatal = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.801167] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plugging_timeout = 300 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.801359] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] virt_mkfs = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.801532] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] volume_usage_poll_interval = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.801690] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] watch_log_file = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.801850] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] web = /usr/share/spice-html5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 558.802043] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_concurrency.disable_process_locking = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.802330] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.802504] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.802664] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.802829] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.802990] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.803187] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.803369] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.auth_strategy = keystone {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.803531] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.compute_link_prefix = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.803694] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.803859] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.dhcp_domain = novalocal {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.804031] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.enable_instance_password = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.804200] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.glance_link_prefix = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.804364] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.804526] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.804680] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.instance_list_per_project_cells = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.804833] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.list_records_by_skipping_down_cells = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.804985] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.local_metadata_per_cell = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.805158] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.max_limit = 1000 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.805321] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.metadata_cache_expiration = 15 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.805488] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.neutron_default_tenant_id = default {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.805648] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.use_forwarded_for = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.805806] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.use_neutron_default_nets = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.805970] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.806137] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.806302] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.806468] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
api.vendordata_dynamic_ssl_certfile = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.806628] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.vendordata_dynamic_targets = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.806787] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.vendordata_jsonfile_path = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.806962] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.807161] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.backend = dogpile.cache.memcached {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.807324] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.backend_argument = **** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.807486] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.config_prefix = cache.oslo {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.807647] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.dead_timeout = 60.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.807804] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.debug_cache_backend = False {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.807958] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.enable_retry_client = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.808125] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.enable_socket_keepalive = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.808289] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.enabled = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.808450] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.expiration_time = 600 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.808608] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.hashclient_retry_attempts = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.808765] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.hashclient_retry_delay = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.808920] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.memcache_dead_retry = 300 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.809090] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.memcache_password = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.809264] env[59369]: DEBUG 
oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.809422] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.809579] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.memcache_pool_maxsize = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.809735] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.809891] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.memcache_sasl_enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.810073] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.810239] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.memcache_socket_timeout = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.810399] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.memcache_username = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.810557] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.proxies = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.810713] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.retry_attempts = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.810871] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.retry_delay = 0.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.811045] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.socket_keepalive_count = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.811232] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.socket_keepalive_idle = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.811394] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.socket_keepalive_interval = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.811547] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.tls_allowed_ciphers = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.811698] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.tls_cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.811850] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.tls_certfile = None {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.812014] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.tls_enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.812173] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cache.tls_keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.812339] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.auth_section = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.812518] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.auth_type = password {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.812672] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.812842] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.catalog_info = volumev3::publicURL {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.813094] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.813280] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.813473] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.cross_az_attach = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.813642] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.debug = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.813800] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.endpoint_template = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.813962] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.http_retries = 3 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.814149] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.814304] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.814471] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.os_region_name = RegionOne {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.814628] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.814783] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cinder.timeout = None {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.814951] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.815119] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.cpu_dedicated_set = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.815272] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.cpu_shared_set = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.815460] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.image_type_exclude_list = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.815634] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.815797] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.max_concurrent_disk_ops = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.815953] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.max_disk_devices_to_attach = -1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.816124] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.816291] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.816448] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.resource_provider_association_refresh = 300 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.816603] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.shutdown_retry_interval = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.816775] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.816948] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] conductor.workers = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.817131] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] console.allowed_origins = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.817288] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] console.ssl_ciphers = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.817452] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] console.ssl_minimum_version = default {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.817617] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] consoleauth.token_ttl = 600 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.817790] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.817942] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.818112] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.818270] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.connect_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.818429] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.connect_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.818579] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.endpoint_override = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.818733] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.818882] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.819044] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.max_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.819200] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.min_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.819350] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.region_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.819500] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.service_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.819659] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.service_type = accelerator {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.819815] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.819967] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.status_code_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.820130] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.status_code_retry_delay = None {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.820284] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.820459] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.820614] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] cyborg.version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.820790] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.backend = sqlalchemy {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.820965] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.connection = **** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.821168] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.connection_debug = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.821347] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.connection_parameters = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.821538] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.connection_recycle_time = 3600 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.821717] env[59369]: DEBUG 
oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.connection_trace = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.821878] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.db_inc_retry_interval = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.822051] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.db_max_retries = 20 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.822218] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.db_max_retry_interval = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.822378] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.db_retry_interval = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.822543] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.max_overflow = 50 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.822701] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.max_pool_size = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.822865] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.max_retries = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.823045] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
database.mysql_enable_ndb = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.823231] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.823391] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.mysql_wsrep_sync_wait = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.823549] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.pool_timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.823712] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.retry_interval = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.823866] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.slave_connection = **** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.824038] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.sqlite_synchronous = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.824203] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] database.use_db_reconnect = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.824379] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.backend = sqlalchemy {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.824550] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.connection = **** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.824714] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.connection_debug = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.824877] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.connection_parameters = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.825044] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.connection_recycle_time = 3600 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.825212] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.connection_trace = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.825375] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.db_inc_retry_interval = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.825561] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.db_max_retries = 20 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.825722] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.db_max_retry_interval = 10 {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.825881] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.db_retry_interval = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.826056] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.max_overflow = 50 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.826223] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.max_pool_size = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.826387] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.max_retries = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.826544] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.mysql_enable_ndb = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.827014] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.827196] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.827365] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.pool_timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
558.827541] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.retry_interval = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.827693] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.slave_connection = **** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.827856] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] api_database.sqlite_synchronous = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.828038] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] devices.enabled_mdev_types = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.828219] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.828379] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ephemeral_storage_encryption.enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.828542] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.828707] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.api_servers = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.828866] env[59369]: DEBUG 
oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.829033] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.829201] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.829356] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.connect_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.829508] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.connect_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.829667] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.debug = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.829826] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.default_trusted_certificate_ids = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.829984] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.enable_certificate_validation = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.830154] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
glance.enable_rbd_download = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.830306] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.endpoint_override = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.830468] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.830623] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.830774] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.max_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.830925] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.min_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.831119] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.num_retries = 3 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.831322] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.rbd_ceph_conf = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.831507] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.rbd_connect_timeout = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.831712] 
env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.rbd_pool = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.831911] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.rbd_user = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.832109] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.region_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.832276] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.service_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.832444] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.service_type = image {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.832608] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.832762] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.status_code_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.832916] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.status_code_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.833115] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.timeout = None {{(pid=59369) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.833292] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.833457] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.verify_glance_signatures = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.833659] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] glance.version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.833880] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] guestfs.debug = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.834139] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.config_drive_cdrom = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.834353] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.config_drive_inject_password = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.834539] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.834705] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.enable_instance_metrics_collection = False {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.834869] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.enable_remotefx = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.835068] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.instances_path_share = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.835262] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.iscsi_initiator_list = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.835465] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.limit_cpu_features = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.835651] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.835817] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.835986] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.power_state_check_timeframe = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.836162] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.836329] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.836491] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.use_multipath_io = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.836648] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.volume_attach_retry_count = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.836805] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.836957] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.vswitch_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.837129] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.837295] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] mks.enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.837656] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.837842] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] image_cache.manager_interval = 2400 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.838014] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] image_cache.precache_concurrency = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.838184] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] image_cache.remove_unused_base_images = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.838353] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.838517] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.838687] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] image_cache.subdirectory_name = _base {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.838924] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.api_max_retries = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.839160] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.api_retry_interval = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.839332] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.auth_section = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.839496] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.auth_type = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.839654] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.839810] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.839976] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.840156] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.connect_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.840305] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.connect_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.840459] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.endpoint_override = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.840620] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.840774] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.840929] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.max_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.841094] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.min_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.841252] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.partition_key = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.841412] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.peer_list = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.841565] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.region_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.841725] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.serial_console_state_timeout = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.841877] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.service_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.842055] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.service_type = baremetal {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.842218] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.842371] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.status_code_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.842524] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.status_code_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.842724] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.842929] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.843126] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ironic.version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.843319] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.843490] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] key_manager.fixed_key = **** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.843667] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.843823] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.barbican_api_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.843978] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.barbican_endpoint = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.844158] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.barbican_endpoint_type = public {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.844315] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.barbican_region_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.844470] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.844621] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.844779] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.844934] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.845100] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.845263] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.number_of_retries = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.845457] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.retry_delay = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.845659] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.send_service_user_token = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.845826] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.845983] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.846190] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.verify_ssl = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.846367] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican.verify_ssl_path = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.846535] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican_service_user.auth_section = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.846692] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican_service_user.auth_type = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.846846] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican_service_user.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.846997] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican_service_user.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.847169] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican_service_user.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.847341] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican_service_user.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.847505] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican_service_user.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.847672] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican_service_user.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.847828] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] barbican_service_user.timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.847989] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.approle_role_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.848188] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.approle_secret_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.848313] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.848465] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.848629] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.848786] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.848938] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.849144] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.kv_mountpoint = secret {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850188] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.kv_version = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850188] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.namespace = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850188] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.root_token_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850188] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850188] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.ssl_ca_crt_file = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850188] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850188] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.use_ssl = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850376] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850504] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850659] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850817] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.850973] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.connect_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.851140] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.connect_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.851295] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.endpoint_override = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.851454] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.851606] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.851758] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.max_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.851909] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.min_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.852069] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.region_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.852222] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.service_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.852384] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.service_type = identity {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.852540] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.852693] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.status_code_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.852846] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.status_code_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.853013] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.853224] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.853389] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] keystone.version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.853582] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.connection_uri = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.853738] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.cpu_mode = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.853898] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.cpu_model_extra_flags = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.854090] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.cpu_models = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.854274] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.cpu_power_governor_high = performance {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.854437] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.cpu_power_governor_low = powersave {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.854594] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.cpu_power_management = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.854760] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.854920] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.device_detach_attempts = 8 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.855090] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.device_detach_timeout = 20 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.855253] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.disk_cachemodes = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.855428] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.disk_prefix = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.855604] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.enabled_perf_events = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.855766] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.file_backed_memory = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.855933] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.gid_maps = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.856098] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.hw_disk_discard = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.856256] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.hw_machine_type = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.856421] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.images_rbd_ceph_conf = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.856583] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.856745] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.856906] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.images_rbd_glance_store_name = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.857079] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.images_rbd_pool = rbd {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.857247] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.images_type = default {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.857399] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.images_volume_group = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.857587] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.inject_key = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.857755] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.inject_partition = -2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.857913] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.inject_password = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.858085] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.iscsi_iface = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.858248] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.iser_use_multipath = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.858407] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_bandwidth = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.858564] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.858721] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_downtime = 500 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.858877] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.859042] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.859202] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_inbound_addr = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.859359] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.859516] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_permit_post_copy = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.859680] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_scheme = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.859849] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_timeout_action = abort {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.860013] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_tunnelled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.860183] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_uri = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.860343] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.live_migration_with_native_tls = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.860496] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.max_queues = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.860655] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.860810] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.nfs_mount_options = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.861137] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.861310] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.861473] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.num_iser_scan_tries = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.861628] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.num_memory_encrypted_guests = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.861788] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.861946] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.num_pcie_ports = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.862121] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.num_volume_scan_tries = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.862284] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.pmem_namespaces = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.862438] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.quobyte_client_cfg = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.862731] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.862899] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.rbd_connect_timeout = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.863094] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.863267] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.863427] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.rbd_secret_uuid = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.863580] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.rbd_user = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.863738] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.863902] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.remote_filesystem_transport = ssh {{(pid=59369) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.864083] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.rescue_image_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.864251] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.rescue_kernel_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.864403] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.rescue_ramdisk_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.864790] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.864966] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.rx_queue_size = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.865147] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.smbfs_mount_options = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.865447] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.865629] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.snapshot_compression = False {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.865946] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.snapshot_image_format = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.866017] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.866167] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.sparse_logical_volumes = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.866330] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.swtpm_enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.866491] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.swtpm_group = tss {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.866653] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.swtpm_user = tss {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.867038] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.sysinfo_serial = unique {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.867217] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.tx_queue_size = None {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.867383] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.uid_maps = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.867588] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.use_virtio_for_bridges = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.867762] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.virt_type = kvm {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.867930] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.volume_clear = zero {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.868104] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.volume_clear_size = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.868272] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.volume_use_multipath = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.868429] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.vzstorage_cache_path = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.868593] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.868756] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.vzstorage_mount_group = qemu {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.868917] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.vzstorage_mount_opts = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.869089] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.869368] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.869564] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.vzstorage_mount_user = stack {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.869910] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.869910] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.auth_section = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.870077] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.auth_type = password {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.870240] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.870402] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.870576] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.870733] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.connect_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.870891] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.connect_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.871066] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.default_floating_pool = public {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.871226] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.endpoint_override = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.871385] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.extension_sync_interval = 600 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.871544] env[59369]: DEBUG 
oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.http_retries = 3 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.871780] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.871936] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.872106] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.max_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.872278] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.872434] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.min_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.872595] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.ovs_bridge = br-int {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.872755] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.physnets = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.872919] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.region_name = RegionOne {{(pid=59369) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.873117] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.service_metadata_proxy = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.873288] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.service_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.873454] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.service_type = network {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.873614] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.873768] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.status_code_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.873920] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.status_code_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.874087] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.874265] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.874423] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] neutron.version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.874588] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] notifications.bdms_in_notifications = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.874759] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] notifications.default_level = INFO {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.874927] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] notifications.notification_format = unversioned {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.875095] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] notifications.notify_on_state_change = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.875267] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.875460] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] pci.alias = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.875645] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] pci.device_spec = [] {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.875809] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] pci.report_in_placement = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.875976] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.auth_section = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.876158] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.auth_type = password {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.876325] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.876480] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.876631] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.876786] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.876940] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.connect_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.877103] env[59369]: 
DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.connect_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.877267] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.default_domain_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.877417] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.default_domain_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.877566] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.domain_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.877716] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.domain_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.877869] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.endpoint_override = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.878034] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.878191] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.878343] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
placement.max_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.878494] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.min_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.878654] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.password = **** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.878808] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.project_domain_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.878965] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.project_domain_name = Default {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.879142] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.project_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.879310] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.project_name = service {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.879471] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.region_name = RegionOne {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.879623] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.service_name = None {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.879784] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.service_type = placement {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.879940] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.880106] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.status_code_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.880265] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.status_code_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.880417] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.system_scope = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.880569] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.880723] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.trust_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.880872] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.user_domain_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.881042] env[59369]: 
DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.user_domain_name = Default {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.881203] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.user_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.881370] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.username = placement {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.881568] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.881745] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] placement.version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.881922] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.cores = 20 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.882102] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.count_usage_from_placement = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.882274] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.882445] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None 
None] quota.injected_file_content_bytes = 10240 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.882608] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.injected_file_path_length = 255 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.882772] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.injected_files = 5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.882936] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.instances = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.883125] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.key_pairs = 100 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.883300] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.metadata_items = 128 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.883467] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.ram = 51200 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.883628] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.recheck_quota = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.883792] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.server_group_members = 10 {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.883953] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] quota.server_groups = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.884134] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] rdp.enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.884450] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.884629] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.884793] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.884963] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler.image_metadata_prefilter = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.885139] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.885305] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler.max_attempts = 3 {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.885494] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler.max_placement_results = 1000 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.885667] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.885836] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler.query_placement_for_availability_zone = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.885996] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler.query_placement_for_image_type_support = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.886170] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.886345] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] scheduler.workers = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.886519] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.886686] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.886860] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.887039] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.887211] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.887374] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.887554] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.887711] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.887874] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 
None None] filter_scheduler.host_subset_size = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.888044] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.888208] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.888368] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.isolated_hosts = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.888531] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.isolated_images = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.888688] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.888844] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.888999] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.pci_in_placement = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.889171] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 
None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.889328] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.889485] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.889638] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.889796] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.889955] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.890125] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.track_instance_changes = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.890298] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
558.890465] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] metrics.required = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.890614] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] metrics.weight_multiplier = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.890770] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.890927] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] metrics.weight_setting = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.891241] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.891415] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] serial_console.enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.891586] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] serial_console.port_range = 10000:20000 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.891751] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.891914] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.892087] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] serial_console.serialproxy_port = 6083 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.892254] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] service_user.auth_section = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.892421] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] service_user.auth_type = password {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.892577] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] service_user.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.892729] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] service_user.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.892885] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] service_user.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.893069] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] service_user.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.893234] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] service_user.keyfile = None 
{{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.893404] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] service_user.send_service_user_token = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.893585] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] service_user.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.893762] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] service_user.timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.893932] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.agent_enabled = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.894136] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.894445] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.894641] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.894810] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.html5proxy_port = 6082 {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.894969] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.image_compression = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.895139] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.jpeg_compression = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.895298] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.playback_compression = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.895466] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.server_listen = 127.0.0.1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.895629] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.895793] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.streaming_mode = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.895948] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] spice.zlib_compression = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.896120] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] upgrade_levels.baseapi = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.896278] env[59369]: 
DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] upgrade_levels.cert = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.896441] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] upgrade_levels.compute = auto {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.896597] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] upgrade_levels.conductor = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.896759] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] upgrade_levels.scheduler = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.896922] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vendordata_dynamic_auth.auth_section = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.897093] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vendordata_dynamic_auth.auth_type = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.897252] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vendordata_dynamic_auth.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.897404] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vendordata_dynamic_auth.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.897561] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.897718] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vendordata_dynamic_auth.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.897869] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vendordata_dynamic_auth.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.898036] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.898193] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vendordata_dynamic_auth.timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.898368] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.api_retry_count = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.898524] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.ca_file = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.898689] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.cache_prefix = devstack-image-cache {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.898850] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None 
None] vmware.cluster_name = testcl1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.899012] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.connection_pool_size = 10 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.899173] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.console_delay_seconds = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.899338] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.datastore_regex = ^datastore.* {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.899541] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.899707] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.host_password = **** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.899868] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.host_port = 443 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.900042] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.host_username = administrator@vsphere.local {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.900211] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.insecure = True {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.900366] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.integration_bridge = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.900525] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.maximum_objects = 100 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.900675] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.pbm_default_policy = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.900831] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.pbm_enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.900981] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.pbm_wsdl_location = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.901156] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.901313] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.serial_port_proxy_uri = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.901465] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.serial_port_service_uri = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.901622] 
env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.task_poll_interval = 0.5 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.901788] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.use_linked_clone = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.901949] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.vnc_keymap = en-us {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.902122] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.vnc_port = 5900 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.902310] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vmware.vnc_port_total = 10000 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.902494] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vnc.auth_schemes = ['none'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.902667] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vnc.enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.902955] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.903173] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
vnc.novncproxy_host = 0.0.0.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.903352] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vnc.novncproxy_port = 6080 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.903528] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vnc.server_listen = 127.0.0.1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.903697] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.903853] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vnc.vencrypt_ca_certs = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.904013] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vnc.vencrypt_client_cert = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.904178] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vnc.vencrypt_client_key = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.904353] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.904510] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.disable_deep_image_inspection = False {{(pid=59369) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.904667] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.904821] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.904974] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.905142] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.disable_rootwrap = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.905299] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.enable_numa_live_migration = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.905482] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.905651] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.905809] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
workarounds.handle_virt_lifecycle_events = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.905964] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.libvirt_disable_apic = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.906131] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.906293] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.906448] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.906603] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.906757] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.906909] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.907074] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.907233] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.907384] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.907541] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.907720] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.907882] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] wsgi.client_socket_timeout = 900 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.908058] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] wsgi.default_pool_size = 1000 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.908224] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] wsgi.keep_alive = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.908382] env[59369]: DEBUG 
oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] wsgi.max_header_line = 16384 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.908540] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] wsgi.secure_proxy_ssl_header = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.908694] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] wsgi.ssl_ca_file = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.908845] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] wsgi.ssl_cert_file = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.908997] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] wsgi.ssl_key_file = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.909168] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] wsgi.tcp_keepidle = 600 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.909336] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.909497] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] zvm.ca_file = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.909652] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] zvm.cloud_connector_url = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.909929] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.910108] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] zvm.reachable_timeout = 300 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.910287] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_policy.enforce_new_defaults = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.910452] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_policy.enforce_scope = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.910620] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_policy.policy_default_rule = default {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.910793] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.910960] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_policy.policy_file = policy.yaml {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.911137] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None 
None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.911294] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.911444] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.911594] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.911753] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.911910] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.912086] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.912261] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] profiler.connection_string = messaging:// {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.912422] env[59369]: DEBUG 
oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] profiler.enabled = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.912586] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] profiler.es_doc_type = notification {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.912746] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] profiler.es_scroll_size = 10000 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.912907] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] profiler.es_scroll_time = 2m {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.913101] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] profiler.filter_error_trace = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.913269] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] profiler.hmac_keys = SECRET_KEY {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.913435] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] profiler.sentinel_service_name = mymaster {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.913600] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] profiler.socket_timeout = 0.1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.913758] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
profiler.trace_sqlalchemy = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.913917] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] remote_debug.host = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.914105] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] remote_debug.port = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.914309] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.914527] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.914701] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.914861] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.915028] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.915190] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.915350] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.915547] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.915715] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.915869] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.916046] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.916222] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.916385] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.916548] env[59369]: DEBUG 
oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.916705] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.916873] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.917042] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.917203] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.917365] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.917524] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.917681] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.917840] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.917998] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.918170] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.918333] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.918493] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.ssl = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.918658] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.918821] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.918976] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=59369) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.919153] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.919319] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_rabbit.ssl_version = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.919500] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.919663] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_notifications.retry = -1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.919841] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.920029] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_messaging_notifications.transport_url = **** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.920191] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.auth_section = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.920351] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.auth_type = None {{(pid=59369) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.920507] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.cafile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.920659] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.certfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.920815] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.collect_timing = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.920968] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.connect_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.921133] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.connect_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.921285] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.endpoint_id = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.921435] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.endpoint_override = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.921591] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.insecure = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.921742] env[59369]: DEBUG 
oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.keyfile = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.921894] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.max_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.922104] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.min_version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.922274] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.region_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.922426] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.service_name = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.922578] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.service_type = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.922736] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.split_loggers = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.922892] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.status_code_retries = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.923071] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
oslo_limit.status_code_retry_delay = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.923240] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.timeout = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.923398] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.valid_interfaces = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.923551] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_limit.version = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.923710] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_reports.file_event_handler = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.923868] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.924034] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] oslo_reports.log_dir = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.924201] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.924358] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plug_linux_bridge_privileged.group = None 
{{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.924511] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.924671] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.924829] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.924983] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.925159] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.925316] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plug_ovs_privileged.group = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.925531] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.925726] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] 
vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.925890] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.926057] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] vif_plug_ovs_privileged.user = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.926229] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_linux_bridge.flat_interface = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.926402] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.926568] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.926734] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.926898] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.927071] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.927237] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.927397] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.927570] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_ovs.isolate_vif = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.927736] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.927898] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.928071] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.928239] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_ovs.ovsdb_interface = native {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.928395] env[59369]: DEBUG oslo_service.service [None 
req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_vif_ovs.per_port_bridge = False {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.928551] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] os_brick.lock_path = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.928715] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] privsep_osbrick.capabilities = [21] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.928867] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] privsep_osbrick.group = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.929030] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] privsep_osbrick.helper_command = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.929195] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.929355] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.929508] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] privsep_osbrick.user = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.929673] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.929824] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] nova_sys_admin.group = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.929974] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] nova_sys_admin.helper_command = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.930146] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.930306] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.930456] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] nova_sys_admin.user = None {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.930581] env[59369]: DEBUG oslo_service.service [None req-4424344c-1ecd-486e-88ae-b027a0686997 None None] ******************************************************************************** {{(pid=59369) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}}
[ 558.930982] env[59369]: INFO nova.service [-] Starting compute node (version 0.1.0)
[ 558.940284] env[59369]: INFO nova.virt.node [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Generated node identity 6ace6145-3535-4f74-aa29-80f64a201369
[ 558.940501] env[59369]: INFO nova.virt.node [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Wrote node identity 6ace6145-3535-4f74-aa29-80f64a201369 to /opt/stack/data/n-cpu-1/compute_id
[ 558.952025] env[59369]: WARNING nova.compute.manager [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Compute nodes ['6ace6145-3535-4f74-aa29-80f64a201369'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
[ 558.982125] env[59369]: INFO nova.compute.manager [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
[ 559.002925] env[59369]: WARNING nova.compute.manager [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found.
[ 559.003175] env[59369]: DEBUG oslo_concurrency.lockutils [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 559.003376] env[59369]: DEBUG oslo_concurrency.lockutils [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 559.003512] env[59369]: DEBUG oslo_concurrency.lockutils [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 559.003780] env[59369]: DEBUG nova.compute.resource_tracker [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59369) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 559.004724] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8e1cbaa-95db-4c10-9480-8a1b4d99cb28 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 559.013286] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-064f6105-f4df-4bbd-9f70-51c4e7d9528d {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 559.027457] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b57b441f-a4dc-4d87-a63c-d5dec5637f15 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 559.034082] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec481a81-25a3-45ec-9fb7-4b2f867fb8a8 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 559.061923] env[59369]: DEBUG nova.compute.resource_tracker [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181830MB free_disk=116GB free_vcpus=48 pci_devices=None {{(pid=59369) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 559.062085] env[59369]: DEBUG oslo_concurrency.lockutils [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 559.062275] env[59369]: DEBUG oslo_concurrency.lockutils [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 559.074483] env[59369]: WARNING nova.compute.resource_tracker [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] No compute node record for cpu-1:6ace6145-3535-4f74-aa29-80f64a201369: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 6ace6145-3535-4f74-aa29-80f64a201369 could not be found.
[ 559.087564] env[59369]: INFO nova.compute.resource_tracker [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 6ace6145-3535-4f74-aa29-80f64a201369
[ 559.139231] env[59369]: DEBUG nova.compute.resource_tracker [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59369) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 559.139388] env[59369]: DEBUG nova.compute.resource_tracker [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59369) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 559.236792] env[59369]: INFO nova.scheduler.client.report [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] [req-d662a61a-7ed6-4640-ab9a-cd6822344a8c] Created resource provider record via placement API for resource provider with UUID 6ace6145-3535-4f74-aa29-80f64a201369 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28.
[ 559.252846] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a13120ee-bb86-47f7-8a70-bab40f558d65 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 559.259997] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9004a81-aab6-421b-bd81-e2e6eba26e19 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 559.288372] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d429caf-c5eb-4934-beda-a8c211d0897d {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 559.294913] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d2bd33c-af0c-418b-bff5-0bedc1a3a0e7 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 559.307381] env[59369]: DEBUG nova.compute.provider_tree [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Updating inventory in ProviderTree for provider 6ace6145-3535-4f74-aa29-80f64a201369 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 559.342836] env[59369]: DEBUG nova.scheduler.client.report [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Updated inventory for provider 6ace6145-3535-4f74-aa29-80f64a201369 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}}
[ 559.343088] env[59369]: DEBUG nova.compute.provider_tree [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Updating resource provider 6ace6145-3535-4f74-aa29-80f64a201369 generation from 0 to 1 during operation: update_inventory {{(pid=59369) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
[ 559.343240] env[59369]: DEBUG nova.compute.provider_tree [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Updating inventory in ProviderTree for provider 6ace6145-3535-4f74-aa29-80f64a201369 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 559.384519] env[59369]: DEBUG nova.compute.provider_tree [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Updating resource provider 6ace6145-3535-4f74-aa29-80f64a201369 generation from 1 to 2 during operation: update_traits {{(pid=59369) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
[ 559.401968] env[59369]: DEBUG nova.compute.resource_tracker [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59369) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 559.402157] env[59369]: DEBUG oslo_concurrency.lockutils [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.340s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 559.402320] env[59369]: DEBUG nova.service [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Creating RPC server for service compute {{(pid=59369) start /opt/stack/nova/nova/service.py:182}}
[ 559.416521] env[59369]: DEBUG nova.service [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] Join ServiceGroup membership for this service compute {{(pid=59369) start /opt/stack/nova/nova/service.py:199}}
[ 559.416696] env[59369]: DEBUG nova.servicegroup.drivers.db [None req-c66e99e4-5d07-4c33-a906-a2db509efaf3 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=59369) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}}
[ 599.953660] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Acquiring lock "02929ad5-b2f4-4a80-a606-2a0c9b6222e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 599.953660] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Lock "02929ad5-b2f4-4a80-a606-2a0c9b6222e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 599.972228] env[59369]: DEBUG nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Starting instance... {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 600.093173] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Acquiring lock "31c17493-263e-4d08-b669-e901b07060d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 600.093281] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Lock "31c17493-263e-4d08-b669-e901b07060d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 600.107072] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 600.107072] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 600.109713] env[59369]: INFO nova.compute.claims [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 600.114386] env[59369]: DEBUG nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Starting instance... {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 600.211783] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 600.239943] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24cad5db-2e57-4639-8711-7d77b00adf79 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.248209] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc6cafeb-e7f0-491d-b7be-d06805c1a3a4 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.279672] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-355dc9e4-e404-4d83-b50b-cd89a65e091e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.287993] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-292be21a-0c8c-4cef-988d-2d4bfcc2851f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.302241] env[59369]: DEBUG nova.compute.provider_tree [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 600.321140] env[59369]: DEBUG nova.scheduler.client.report [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 600.343286] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.234s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 600.343286] env[59369]: DEBUG nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 600.348855] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.137s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 600.353511] env[59369]: INFO nova.compute.claims [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 600.399899] env[59369]: DEBUG nova.compute.utils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 600.405799] env[59369]: DEBUG nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Allocating IP information in the background. {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 600.405799] env[59369]: DEBUG nova.network.neutron [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 600.423757] env[59369]: DEBUG nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 600.526355] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Acquiring lock "b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 600.526511] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Lock "b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 600.531691] env[59369]: DEBUG nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Start spawning the instance on the hypervisor. {{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 600.544147] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33cc8df1-dd95-4ccb-95ca-bc1271993170 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.548843] env[59369]: DEBUG nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Starting instance... {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 600.557596] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e00ed29f-0e32-4543-b935-77ab8cddd738 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.597247] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7100a0a7-e767-43f7-a04e-ec7d8dd7ee58 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.606714] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ca9efac-fcd5-4bd2-95e5-3a20b62b4242 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.624017] env[59369]: DEBUG nova.compute.provider_tree [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 600.628254] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 600.635520] env[59369]: DEBUG nova.scheduler.client.report [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 600.643666] env[59369]: DEBUG nova.virt.hardware [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 600.643896] env[59369]: DEBUG nova.virt.hardware [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 600.647516] env[59369]: DEBUG nova.virt.hardware [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 600.647781] env[59369]: DEBUG nova.virt.hardware [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Flavor pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 600.647931] env[59369]: DEBUG nova.virt.hardware [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 600.648088] env[59369]: DEBUG nova.virt.hardware [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 600.648304] env[59369]: DEBUG nova.virt.hardware [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 600.648456] env[59369]: DEBUG nova.virt.hardware [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 600.649143] env[59369]: DEBUG nova.virt.hardware [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 600.649325] env[59369]: DEBUG nova.virt.hardware [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 600.649496] env[59369]: DEBUG nova.virt.hardware [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 600.653427] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78af4af6-6459-4e86-a693-51a3caa5ff41 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.671162] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e94636c7-173a-4fa8-b2c4-a5e5e6b01c19 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.674295] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 600.674849] env[59369]: DEBUG nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 600.678540] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.050s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 600.681216] env[59369]: INFO nova.compute.claims [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 600.700714] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1d733a8-d89b-4fed-8f29-49bea719212e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.722343] env[59369]: DEBUG nova.compute.utils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 600.724244] env[59369]: DEBUG nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Allocating IP information in the background. {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 600.724893] env[59369]: DEBUG nova.network.neutron [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 600.733037] env[59369]: DEBUG nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 600.816065] env[59369]: DEBUG nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Start spawning the instance on the hypervisor. {{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 600.830206] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f99a8940-590b-4883-9a7c-02779271d885 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.843476] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1750691-c7fb-4e24-8c5b-d5dcccaa2fda {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.849126] env[59369]: DEBUG nova.virt.hardware [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 600.849388] env[59369]: DEBUG nova.virt.hardware [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 600.849541] env[59369]: DEBUG nova.virt.hardware [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 600.849751] env[59369]: DEBUG nova.virt.hardware [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Flavor pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 600.849911] env[59369]: DEBUG nova.virt.hardware [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 600.850067] env[59369]: DEBUG nova.virt.hardware [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 600.850278] env[59369]: DEBUG nova.virt.hardware [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 600.850433] env[59369]: DEBUG nova.virt.hardware [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661
tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 600.850592] env[59369]: DEBUG nova.virt.hardware [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 600.850749] env[59369]: DEBUG nova.virt.hardware [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 600.850916] env[59369]: DEBUG nova.virt.hardware [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 600.852204] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4abf664-8720-43fa-93e2-8050ec0e6362 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.906109] env[59369]: DEBUG nova.policy [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4b83b35a6f564a6e92344cbcb6e9723d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7147c055c33c4dfaa1e9638fb3e99d49', 'project_domain_id': 'default', 
'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}} [ 600.909164] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0094216b-f0b1-404b-bea9-faa713fddec8 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.918936] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d13829d-6760-40f8-a98c-bafeac50037a {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.923911] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5af2b71-3e13-4554-9b87-7eab86c0c5ca {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.943448] env[59369]: DEBUG nova.compute.provider_tree [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 600.962493] env[59369]: DEBUG nova.scheduler.client.report [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 600.984320] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 600.985065] env[59369]: DEBUG nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 601.015121] env[59369]: DEBUG nova.policy [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd221d60a657948d2b0a9be4e28e8cb1a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2eeec6feafe84002a6fc8c2050a1b3e4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}} [ 601.025033] env[59369]: DEBUG nova.compute.utils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name 
/opt/stack/nova/nova/compute/utils.py:238}} [ 601.029439] env[59369]: DEBUG nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Allocating IP information in the background. {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 601.029439] env[59369]: DEBUG nova.network.neutron [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 601.045306] env[59369]: DEBUG nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 601.133494] env[59369]: DEBUG nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Start spawning the instance on the hypervisor. 
{{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 601.163495] env[59369]: DEBUG nova.virt.hardware [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 601.163745] env[59369]: DEBUG nova.virt.hardware [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 601.163995] env[59369]: DEBUG nova.virt.hardware [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 601.164379] env[59369]: DEBUG nova.virt.hardware [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Flavor pref 0:0:0 
{{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 601.164445] env[59369]: DEBUG nova.virt.hardware [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 601.164648] env[59369]: DEBUG nova.virt.hardware [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 601.164720] env[59369]: DEBUG nova.virt.hardware [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 601.164872] env[59369]: DEBUG nova.virt.hardware [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 601.165036] env[59369]: DEBUG nova.virt.hardware [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 601.165195] env[59369]: DEBUG nova.virt.hardware [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 601.165368] env[59369]: DEBUG nova.virt.hardware [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 601.166242] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f12e7a7f-ae86-493d-9564-b232d203a11c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.175376] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1136f3f-6aba-4ce3-bfae-3b40335fb61a {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.230978] env[59369]: DEBUG nova.policy [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2af6af018d44470aba1bcf563a25a9a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '22de4f4bea4140fe858257920cccb630', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}} [ 601.905885] env[59369]: DEBUG nova.network.neutron [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 
02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Successfully created port: a1ad8c18-3ecc-4f30-bd35-ed52c7085368 {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 601.944665] env[59369]: DEBUG nova.network.neutron [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Successfully created port: bbe86fc8-b664-48e5-b16f-944d69cf7ecd {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 602.239353] env[59369]: DEBUG nova.network.neutron [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Successfully created port: eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91 {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 602.915247] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Acquiring lock "eae87f89-8488-42e6-b065-1198bfbe8177" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.915618] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Lock "eae87f89-8488-42e6-b065-1198bfbe8177" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.936775] env[59369]: DEBUG nova.compute.manager [None 
req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Starting instance... {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 602.987789] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Acquiring lock "d501202a-354c-42e1-8480-f026d5216a58" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.987789] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Lock "d501202a-354c-42e1-8480-f026d5216a58" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.997268] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.997924] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s 
{{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.999956] env[59369]: INFO nova.compute.claims [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 603.005353] env[59369]: DEBUG nova.compute.manager [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Starting instance... {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 603.092630] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.183072] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abd3b5c9-912e-4ef4-afaf-89099809370f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.187669] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52d3c135-ea4c-49d7-9cbf-86a739e60c6a {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.221026] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f59dc9da-ce3c-463e-9065-c51fee91c130 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
603.230819] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-587fe181-610d-412c-bc4d-59769ecea420 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.244133] env[59369]: DEBUG nova.compute.provider_tree [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 603.255493] env[59369]: DEBUG nova.scheduler.client.report [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 603.273018] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.273018] env[59369]: DEBUG nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 
tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 603.274639] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.182s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.276200] env[59369]: INFO nova.compute.claims [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 603.310016] env[59369]: DEBUG nova.compute.utils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 603.311269] env[59369]: DEBUG nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Allocating IP information in the background. 
{{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 603.311269] env[59369]: DEBUG nova.network.neutron [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 603.326871] env[59369]: DEBUG nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 603.425773] env[59369]: DEBUG nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Start spawning the instance on the hypervisor. 
{{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 603.430959] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d02ae9a-b32e-4211-8070-e03d749afc68 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.439579] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2655820-f68e-4d53-8c32-2bfb195b681e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.474258] env[59369]: DEBUG nova.virt.hardware [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 603.475127] env[59369]: DEBUG nova.virt.hardware [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:348}}
[ 603.475127] env[59369]: DEBUG nova.virt.hardware [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 603.475127] env[59369]: DEBUG nova.virt.hardware [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Flavor pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 603.475127] env[59369]: DEBUG nova.virt.hardware [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 603.475127] env[59369]: DEBUG nova.virt.hardware [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 603.475612] env[59369]: DEBUG nova.virt.hardware [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 603.475612] env[59369]: DEBUG nova.virt.hardware [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 603.475612] env[59369]: DEBUG nova.virt.hardware [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 603.475752] env[59369]: DEBUG nova.virt.hardware [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 603.476054] env[59369]: DEBUG nova.virt.hardware [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 603.476778] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fe4a5a0-a782-4d80-85a7-c5f0dc3bcdb3 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 603.479904] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-692e3154-97c2-4ceb-a223-de5babb71d16 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 603.490583] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd53cbbf-8829-44b9-a95f-39c9c8ed9100 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 603.495702] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed750e74-87c5-4585-ad08-037487f27f2f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 603.510430] env[59369]: DEBUG nova.compute.provider_tree [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 603.528221] env[59369]: DEBUG nova.scheduler.client.report [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 603.551250] env[59369]: DEBUG nova.policy [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f8d37ddac24746b195811cc2ae491aae', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8b65b9c5302146b987bbec80e1b803bb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}}
[ 603.554650] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 603.555123] env[59369]: DEBUG nova.compute.manager [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 603.599633] env[59369]: DEBUG nova.compute.utils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 603.601039] env[59369]: DEBUG nova.compute.manager [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Allocating IP information in the background. {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 603.601250] env[59369]: DEBUG nova.network.neutron [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 603.614940] env[59369]: DEBUG nova.compute.manager [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 603.699833] env[59369]: DEBUG nova.compute.manager [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Start spawning the instance on the hypervisor. {{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 603.739742] env[59369]: DEBUG nova.virt.hardware [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 603.739968] env[59369]: DEBUG nova.virt.hardware [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 603.740128] env[59369]: DEBUG nova.virt.hardware [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 603.740299] env[59369]: DEBUG nova.virt.hardware [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Flavor pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 603.740447] env[59369]: DEBUG nova.virt.hardware [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 603.740584] env[59369]: DEBUG nova.virt.hardware [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 603.741085] env[59369]: DEBUG nova.virt.hardware [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 603.741286] env[59369]: DEBUG nova.virt.hardware [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 603.741451] env[59369]: DEBUG nova.virt.hardware [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 603.741646] env[59369]: DEBUG nova.virt.hardware [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 603.741830] env[59369]: DEBUG nova.virt.hardware [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 603.742811] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6534aee1-65ab-422c-9dd6-8b34af071489 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 603.752389] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0c169d4-db31-451d-accd-a00addf8013b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 604.002927] env[59369]: DEBUG nova.policy [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bbd9bbbbcf914b64865b4bedb3ccac03', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e99a6ff60020444f9ec68d18f043c64f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}}
[ 604.606949] env[59369]: DEBUG nova.network.neutron [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Successfully updated port: bbe86fc8-b664-48e5-b16f-944d69cf7ecd {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 604.630140] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Acquiring lock "refresh_cache-31c17493-263e-4d08-b669-e901b07060d5" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 604.630140] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Acquired lock "refresh_cache-31c17493-263e-4d08-b669-e901b07060d5" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 604.630140] env[59369]: DEBUG nova.network.neutron [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 604.697201] env[59369]: DEBUG nova.network.neutron [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Successfully updated port: eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91 {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 604.709758] env[59369]: DEBUG nova.network.neutron [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Successfully updated port: a1ad8c18-3ecc-4f30-bd35-ed52c7085368 {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 604.716247] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Acquiring lock "refresh_cache-b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 604.716247] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Acquired lock "refresh_cache-b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 604.716247] env[59369]: DEBUG nova.network.neutron [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 604.731170] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Acquiring lock "refresh_cache-02929ad5-b2f4-4a80-a606-2a0c9b6222e4" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 604.731170] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Acquired lock "refresh_cache-02929ad5-b2f4-4a80-a606-2a0c9b6222e4" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 604.731885] env[59369]: DEBUG nova.network.neutron [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 604.777192] env[59369]: DEBUG nova.network.neutron [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Instance cache missing network info. {{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 604.910845] env[59369]: DEBUG nova.network.neutron [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Instance cache missing network info. {{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 605.002568] env[59369]: DEBUG nova.network.neutron [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Instance cache missing network info. {{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 605.335944] env[59369]: DEBUG nova.network.neutron [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Updating instance_info_cache with network_info: [{"id": "bbe86fc8-b664-48e5-b16f-944d69cf7ecd", "address": "fa:16:3e:16:e3:63", "network": {"id": "5bf3b46a-5cae-491d-ab4e-f9774d11c1d8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-332625799-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2eeec6feafe84002a6fc8c2050a1b3e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "209639b9-c313-4b35-86dc-dccd744d174a", "external-id": "nsx-vlan-transportzone-868", "segmentation_id": 868, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbbe86fc8-b6", "ovs_interfaceid": "bbe86fc8-b664-48e5-b16f-944d69cf7ecd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 605.351388] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Releasing lock "refresh_cache-31c17493-263e-4d08-b669-e901b07060d5" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 605.351388] env[59369]: DEBUG nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Instance network_info: |[{"id": "bbe86fc8-b664-48e5-b16f-944d69cf7ecd", "address": "fa:16:3e:16:e3:63", "network": {"id": "5bf3b46a-5cae-491d-ab4e-f9774d11c1d8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-332625799-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2eeec6feafe84002a6fc8c2050a1b3e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "209639b9-c313-4b35-86dc-dccd744d174a", "external-id": "nsx-vlan-transportzone-868", "segmentation_id": 868, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbbe86fc8-b6", "ovs_interfaceid": "bbe86fc8-b664-48e5-b16f-944d69cf7ecd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 605.351919] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:16:e3:63', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '209639b9-c313-4b35-86dc-dccd744d174a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bbe86fc8-b664-48e5-b16f-944d69cf7ecd', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 605.365119] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 605.366475] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-06bec221-9ed1-4e42-afe7-8556b145d3d0 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 605.379165] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Created folder: OpenStack in parent group-v4.
[ 605.379349] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Creating folder: Project (2eeec6feafe84002a6fc8c2050a1b3e4). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 605.379590] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cb67b88c-50e2-495a-a751-b6d8e4ceb9ba {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 605.389702] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Created folder: Project (2eeec6feafe84002a6fc8c2050a1b3e4) in parent group-v121837.
[ 605.389882] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Creating folder: Instances. Parent ref: group-v121838. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 605.390151] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-166d80d7-7df8-429d-8ef5-1cce9fef2ded {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 605.399167] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Created folder: Instances in parent group-v121838.
[ 605.399422] env[59369]: DEBUG oslo.service.loopingcall [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 605.399610] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 605.399796] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-56bce6a8-7431-49af-ae59-e5e2afe81a11 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 605.421776] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 605.421776] env[59369]: value = "task-463223"
[ 605.421776] env[59369]: _type = "Task"
[ 605.421776] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 605.435382] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463223, 'name': CreateVM_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 605.819020] env[59369]: DEBUG nova.network.neutron [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Successfully created port: 6c523737-a1dc-4be3-a0b1-46b4186741ff {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 605.936688] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463223, 'name': CreateVM_Task, 'duration_secs': 0.298598} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 605.936845] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 605.965665] env[59369]: DEBUG oslo_vmware.service [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99537d5a-d6bb-47e8-b375-f43539bed483 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 605.977132] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 605.977132] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 605.977774] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 605.980019] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-abf4f2d0-3783-42e4-86a3-78404d907cd9 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 605.984367] env[59369]: DEBUG oslo_vmware.api [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Waiting for the task: (returnval){
[ 605.984367] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]52f8636b-092a-8b20-d1a3-e3437ed1522b"
[ 605.984367] env[59369]: _type = "Task"
[ 605.984367] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 605.994904] env[59369]: DEBUG oslo_vmware.api [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]52f8636b-092a-8b20-d1a3-e3437ed1522b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 606.148770] env[59369]: DEBUG nova.network.neutron [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Updating instance_info_cache with network_info: [{"id": "eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91", "address": "fa:16:3e:37:ad:42", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.231", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeddd35f6-cd", "ovs_interfaceid": "eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 606.163254] env[59369]: DEBUG nova.network.neutron [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Updating instance_info_cache with network_info: [{"id": "a1ad8c18-3ecc-4f30-bd35-ed52c7085368", "address": "fa:16:3e:2c:95:75", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa1ad8c18-3e", "ovs_interfaceid": "a1ad8c18-3ecc-4f30-bd35-ed52c7085368", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 606.174015] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Releasing lock "refresh_cache-b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 606.174015] env[59369]: DEBUG nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Instance network_info: |[{"id": "eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91", "address": "fa:16:3e:37:ad:42", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.231", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeddd35f6-cd", "ovs_interfaceid": "eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 606.174219] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:37:ad:42', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad72c645-a67d-4efd-b563-28e44077e68d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 606.183886] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Creating folder: Project (22de4f4bea4140fe858257920cccb630). Parent ref: group-v121837.
{{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.183886] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-159a5a31-5c27-4392-96ab-e7b7aa877025 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.184952] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Releasing lock "refresh_cache-02929ad5-b2f4-4a80-a606-2a0c9b6222e4" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.184952] env[59369]: DEBUG nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Instance network_info: |[{"id": "a1ad8c18-3ecc-4f30-bd35-ed52c7085368", "address": "fa:16:3e:2c:95:75", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa1ad8c18-3e", "ovs_interfaceid": "a1ad8c18-3ecc-4f30-bd35-ed52c7085368", "qbh_params": null, "qbg_params": null, "active": true, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 606.185743] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2c:95:75', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad72c645-a67d-4efd-b563-28e44077e68d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a1ad8c18-3ecc-4f30-bd35-ed52c7085368', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 606.192746] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Creating folder: Project (7147c055c33c4dfaa1e9638fb3e99d49). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.193685] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f35abe8e-5891-455d-915a-7dc58c4b823c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.205985] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Created folder: Project (22de4f4bea4140fe858257920cccb630) in parent group-v121837. [ 606.205985] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Creating folder: Instances. 
Parent ref: group-v121841. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.205985] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-57a08531-61bc-4fe6-970a-84612942582b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.207789] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Created folder: Project (7147c055c33c4dfaa1e9638fb3e99d49) in parent group-v121837. [ 606.208166] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Creating folder: Instances. Parent ref: group-v121842. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.208510] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6ecc4c07-5d01-4324-a7be-ca8ddb87bab0 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.217552] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Created folder: Instances in parent group-v121842. [ 606.219569] env[59369]: DEBUG oslo.service.loopingcall [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 606.219569] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 606.219569] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Created folder: Instances in parent group-v121841. [ 606.219569] env[59369]: DEBUG oslo.service.loopingcall [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 606.219569] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cb57c971-6982-4057-959e-fba0fb1840c0 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.234682] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 606.235343] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bbec76ed-c57c-4f57-8258-b4214772c826 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.256218] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 606.256218] env[59369]: value = "task-463229" [ 606.256218] env[59369]: _type = "Task" [ 606.256218] env[59369]: } to complete. 
{{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.258106] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 606.258106] env[59369]: value = "task-463228" [ 606.258106] env[59369]: _type = "Task" [ 606.258106] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.271750] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463229, 'name': CreateVM_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 606.271947] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463228, 'name': CreateVM_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 606.502168] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.502168] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 606.502168] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" 
{{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 606.502168] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 606.505287] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 606.505287] env[59369]: DEBUG nova.network.neutron [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Successfully created port: 92977e32-ac33-400b-a6d8-cd7726581aff {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 606.506400] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8f772c47-1de0-424f-b093-39f47fd28bdb {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.515691] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 606.515988] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None 
req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59369) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 606.516913] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3a04562-709c-483a-a1c9-19fcbd62c2e7 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.527751] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ac371f71-62de-4c94-810c-36ddd218da52 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.533590] env[59369]: DEBUG oslo_vmware.api [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Waiting for the task: (returnval){ [ 606.533590] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]529bedf1-a622-c6ad-fb5f-f7af16f0a678" [ 606.533590] env[59369]: _type = "Task" [ 606.533590] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.544952] env[59369]: DEBUG oslo_vmware.api [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]529bedf1-a622-c6ad-fb5f-f7af16f0a678, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 606.774171] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463229, 'name': CreateVM_Task, 'duration_secs': 0.387551} completed successfully. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 606.779290] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 606.779551] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463228, 'name': CreateVM_Task, 'duration_secs': 0.385199} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 606.780239] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 606.784019] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 606.784019] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 606.784019] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 606.784019] env[59369]: DEBUG 
oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f171b8e5-de19-4ca4-9219-27d16aae1686 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.785268] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 606.790997] env[59369]: DEBUG oslo_vmware.api [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Waiting for the task: (returnval){ [ 606.790997] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]5224d192-efbe-356d-42f9-d0b4260e4587" [ 606.790997] env[59369]: _type = "Task" [ 606.790997] env[59369]: } to complete. 
{{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.805858] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.806165] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 606.806407] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 606.806634] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 606.806986] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Acquired external semaphore "[datastore1] 
devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 606.807278] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-75a6c668-3b71-48bc-9b74-f880b2c9f492 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.813592] env[59369]: DEBUG oslo_vmware.api [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Waiting for the task: (returnval){ [ 606.813592] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]52509e28-f146-5a21-c825-c2a4e041cf62" [ 606.813592] env[59369]: _type = "Task" [ 606.813592] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.822189] env[59369]: DEBUG oslo_vmware.api [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]52509e28-f146-5a21-c825-c2a4e041cf62, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 607.048934] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Preparing fetch location {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 607.049216] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Creating directory with path [datastore1] vmware_temp/ce7bae6e-4193-4ad2-8925-c9c0abe682fa/dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 607.049452] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0731f1e3-52e4-4fd4-b045-02eaed82c671 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.075956] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Created directory with path [datastore1] vmware_temp/ce7bae6e-4193-4ad2-8925-c9c0abe682fa/dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 607.075956] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Fetch image to [datastore1] vmware_temp/ce7bae6e-4193-4ad2-8925-c9c0abe682fa/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk {{(pid=59369) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 607.075956] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Downloading image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to [datastore1] vmware_temp/ce7bae6e-4193-4ad2-8925-c9c0abe682fa/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk on the data store datastore1 {{(pid=59369) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 607.076352] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b17b4218-7620-49d3-9dee-5bb873f4ae94 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.086335] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b1b71a5-440d-4d6c-a290-ce0a02b1820c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.097087] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3063a1d8-3678-4e1c-a41c-ae6adecd1dae {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.138170] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db67ab3b-096f-4861-beb2-ef7fbae43c76 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.146616] env[59369]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-413e484b-c912-47d6-8030-49bf7bf0ce1b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.177132] env[59369]: DEBUG 
nova.virt.vmwareapi.images [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Downloading image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to the data store datastore1 {{(pid=59369) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 607.260802] env[59369]: DEBUG oslo_vmware.rw_handles [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ce7bae6e-4193-4ad2-8925-c9c0abe682fa/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59369) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 607.339797] env[59369]: DEBUG oslo_vmware.rw_handles [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Completed reading data from the image iterator. {{(pid=59369) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 607.339797] env[59369]: DEBUG oslo_vmware.rw_handles [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ce7bae6e-4193-4ad2-8925-c9c0abe682fa/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59369) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 607.340110] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 607.341165] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 607.341165] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.969355] env[59369]: DEBUG nova.compute.manager [req-fc646f38-a849-4e1b-b23e-a0228f75ce0f req-61f654af-f9b5-4b54-af77-9056134b1ee6 service nova] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Received event network-vif-plugged-bbe86fc8-b664-48e5-b16f-944d69cf7ecd {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 607.969709] env[59369]: DEBUG oslo_concurrency.lockutils [req-fc646f38-a849-4e1b-b23e-a0228f75ce0f req-61f654af-f9b5-4b54-af77-9056134b1ee6 service nova] Acquiring lock "31c17493-263e-4d08-b669-e901b07060d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59369) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.969808] env[59369]: DEBUG oslo_concurrency.lockutils [req-fc646f38-a849-4e1b-b23e-a0228f75ce0f req-61f654af-f9b5-4b54-af77-9056134b1ee6 service nova] Lock "31c17493-263e-4d08-b669-e901b07060d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.969991] env[59369]: DEBUG oslo_concurrency.lockutils [req-fc646f38-a849-4e1b-b23e-a0228f75ce0f req-61f654af-f9b5-4b54-af77-9056134b1ee6 service nova] Lock "31c17493-263e-4d08-b669-e901b07060d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 607.970102] env[59369]: DEBUG nova.compute.manager [req-fc646f38-a849-4e1b-b23e-a0228f75ce0f req-61f654af-f9b5-4b54-af77-9056134b1ee6 service nova] [instance: 31c17493-263e-4d08-b669-e901b07060d5] No waiting events found dispatching network-vif-plugged-bbe86fc8-b664-48e5-b16f-944d69cf7ecd {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 607.970261] env[59369]: WARNING nova.compute.manager [req-fc646f38-a849-4e1b-b23e-a0228f75ce0f req-61f654af-f9b5-4b54-af77-9056134b1ee6 service nova] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Received unexpected event network-vif-plugged-bbe86fc8-b664-48e5-b16f-944d69cf7ecd for instance with vm_state building and task_state spawning. 
[ 608.243961] env[59369]: DEBUG nova.compute.manager [req-edd15205-9a38-4a13-84e8-67d075a98a99 req-e850deb8-449d-4d4b-ae78-320af720d5ed service nova] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Received event network-vif-plugged-a1ad8c18-3ecc-4f30-bd35-ed52c7085368 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 608.243961] env[59369]: DEBUG oslo_concurrency.lockutils [req-edd15205-9a38-4a13-84e8-67d075a98a99 req-e850deb8-449d-4d4b-ae78-320af720d5ed service nova] Acquiring lock "02929ad5-b2f4-4a80-a606-2a0c9b6222e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.243961] env[59369]: DEBUG oslo_concurrency.lockutils [req-edd15205-9a38-4a13-84e8-67d075a98a99 req-e850deb8-449d-4d4b-ae78-320af720d5ed service nova] Lock "02929ad5-b2f4-4a80-a606-2a0c9b6222e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.243961] env[59369]: DEBUG oslo_concurrency.lockutils [req-edd15205-9a38-4a13-84e8-67d075a98a99 req-e850deb8-449d-4d4b-ae78-320af720d5ed service nova] Lock "02929ad5-b2f4-4a80-a606-2a0c9b6222e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.244245] env[59369]: DEBUG nova.compute.manager [req-edd15205-9a38-4a13-84e8-67d075a98a99 req-e850deb8-449d-4d4b-ae78-320af720d5ed service nova] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] No waiting events found dispatching network-vif-plugged-a1ad8c18-3ecc-4f30-bd35-ed52c7085368 {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 608.244245] env[59369]: WARNING nova.compute.manager 
[req-edd15205-9a38-4a13-84e8-67d075a98a99 req-e850deb8-449d-4d4b-ae78-320af720d5ed service nova] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Received unexpected event network-vif-plugged-a1ad8c18-3ecc-4f30-bd35-ed52c7085368 for instance with vm_state building and task_state spawning. [ 608.570363] env[59369]: DEBUG nova.network.neutron [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Successfully updated port: 6c523737-a1dc-4be3-a0b1-46b4186741ff {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 608.587530] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Acquiring lock "refresh_cache-eae87f89-8488-42e6-b065-1198bfbe8177" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 608.587530] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Acquired lock "refresh_cache-eae87f89-8488-42e6-b065-1198bfbe8177" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 608.587530] env[59369]: DEBUG nova.network.neutron [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 608.678144] env[59369]: DEBUG nova.network.neutron [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 
tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Instance cache missing network info. {{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 609.011607] env[59369]: DEBUG nova.network.neutron [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Successfully updated port: 92977e32-ac33-400b-a6d8-cd7726581aff {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 609.022102] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Acquiring lock "refresh_cache-d501202a-354c-42e1-8480-f026d5216a58" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 609.022303] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Acquired lock "refresh_cache-d501202a-354c-42e1-8480-f026d5216a58" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 609.022475] env[59369]: DEBUG nova.network.neutron [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 609.112586] env[59369]: DEBUG nova.network.neutron [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] 
[instance: d501202a-354c-42e1-8480-f026d5216a58] Instance cache missing network info. {{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 609.159119] env[59369]: DEBUG nova.network.neutron [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Updating instance_info_cache with network_info: [{"id": "6c523737-a1dc-4be3-a0b1-46b4186741ff", "address": "fa:16:3e:74:fd:25", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c523737-a1", "ovs_interfaceid": "6c523737-a1dc-4be3-a0b1-46b4186741ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 609.178966] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Releasing lock "refresh_cache-eae87f89-8488-42e6-b065-1198bfbe8177" 
{{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 609.179360] env[59369]: DEBUG nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Instance network_info: |[{"id": "6c523737-a1dc-4be3-a0b1-46b4186741ff", "address": "fa:16:3e:74:fd:25", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c523737-a1", "ovs_interfaceid": "6c523737-a1dc-4be3-a0b1-46b4186741ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 609.183448] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:74:fd:25', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 
'ad72c645-a67d-4efd-b563-28e44077e68d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6c523737-a1dc-4be3-a0b1-46b4186741ff', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 609.191663] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Creating folder: Project (8b65b9c5302146b987bbec80e1b803bb). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 609.192050] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f60a46e4-6b94-499a-a528-5ec6a4f5fc60 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.203626] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Created folder: Project (8b65b9c5302146b987bbec80e1b803bb) in parent group-v121837. [ 609.203817] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Creating folder: Instances. Parent ref: group-v121847. 
{{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 609.205194] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9d9330ba-9c89-4624-8b84-7dab3e72338a {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.216547] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Created folder: Instances in parent group-v121847. [ 609.216750] env[59369]: DEBUG oslo.service.loopingcall [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 609.220411] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 609.220733] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c8cc2469-988b-4c88-b3ec-37753554ebdf {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.241748] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 609.241748] env[59369]: value = "task-463232" [ 609.241748] env[59369]: _type = "Task" [ 609.241748] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 609.249557] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463232, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 609.522155] env[59369]: DEBUG nova.network.neutron [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Updating instance_info_cache with network_info: [{"id": "92977e32-ac33-400b-a6d8-cd7726581aff", "address": "fa:16:3e:98:2f:34", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92977e32-ac", "ovs_interfaceid": "92977e32-ac33-400b-a6d8-cd7726581aff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 609.544579] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Releasing lock "refresh_cache-d501202a-354c-42e1-8480-f026d5216a58" {{(pid=59369) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 609.544579] env[59369]: DEBUG nova.compute.manager [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Instance network_info: |[{"id": "92977e32-ac33-400b-a6d8-cd7726581aff", "address": "fa:16:3e:98:2f:34", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92977e32-ac", "ovs_interfaceid": "92977e32-ac33-400b-a6d8-cd7726581aff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 609.544814] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:98:2f:34', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 
'ad72c645-a67d-4efd-b563-28e44077e68d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '92977e32-ac33-400b-a6d8-cd7726581aff', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 609.548561] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Creating folder: Project (e99a6ff60020444f9ec68d18f043c64f). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 609.553374] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f4d029e9-1b4d-4c51-a3b4-8f1c2f350ee3 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.563403] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Created folder: Project (e99a6ff60020444f9ec68d18f043c64f) in parent group-v121837. [ 609.563403] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Creating folder: Instances. Parent ref: group-v121850. 
{{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 609.563896] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9cf0a2a7-bec4-44f5-862a-79ed3945f682 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.574016] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Created folder: Instances in parent group-v121850. [ 609.574016] env[59369]: DEBUG oslo.service.loopingcall [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 609.574016] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d501202a-354c-42e1-8480-f026d5216a58] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 609.574016] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b30f4cb8-196c-4dac-8d4a-e20f8bd3624f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.605186] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 609.605186] env[59369]: value = "task-463235" [ 609.605186] env[59369]: _type = "Task" [ 609.605186] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 609.615322] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463235, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 609.755035] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463232, 'name': CreateVM_Task, 'duration_secs': 0.315282} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 609.755035] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 609.755035] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 609.755035] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 609.755035] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 609.755402] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f76757d7-ce9c-42fe-8a3d-2818a4f2de91 {{(pid=59369) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.758870] env[59369]: DEBUG oslo_vmware.api [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Waiting for the task: (returnval){ [ 609.758870] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]5278d80d-de40-b92e-04fb-4b0a96a039ac" [ 609.758870] env[59369]: _type = "Task" [ 609.758870] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 609.769327] env[59369]: DEBUG oslo_vmware.api [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]5278d80d-de40-b92e-04fb-4b0a96a039ac, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 610.114936] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463235, 'name': CreateVM_Task, 'duration_secs': 0.305876} completed successfully. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 610.115381] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d501202a-354c-42e1-8480-f026d5216a58] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 610.115823] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 610.272155] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 610.272409] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 610.272901] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 610.272901] env[59369]: DEBUG 
oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 610.273942] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 610.273942] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-940eb221-ba6f-47fe-a013-b66b58c40399 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.278918] env[59369]: DEBUG oslo_vmware.api [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Waiting for the task: (returnval){ [ 610.278918] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]523d75f4-4c4f-c102-3be6-69ed548347d4" [ 610.278918] env[59369]: _type = "Task" [ 610.278918] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 610.287339] env[59369]: DEBUG oslo_vmware.api [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]523d75f4-4c4f-c102-3be6-69ed548347d4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 610.419575] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 610.454595] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Getting list of instances from cluster (obj){ [ 610.454595] env[59369]: value = "domain-c8" [ 610.454595] env[59369]: _type = "ClusterComputeResource" [ 610.454595] env[59369]: } {{(pid=59369) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 610.457677] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f651b1a0-1db3-45cb-af43-86f1103461c9 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.472759] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Got total of 5 instances {{(pid=59369) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 610.472969] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Triggering sync for uuid 02929ad5-b2f4-4a80-a606-2a0c9b6222e4 {{(pid=59369) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10222}} [ 610.473175] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Triggering sync for uuid 31c17493-263e-4d08-b669-e901b07060d5 {{(pid=59369) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10222}} [ 610.473334] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Triggering sync for uuid b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0 {{(pid=59369) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10222}} [ 
610.473510] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Triggering sync for uuid eae87f89-8488-42e6-b065-1198bfbe8177 {{(pid=59369) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10222}}
[ 610.473659] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Triggering sync for uuid d501202a-354c-42e1-8480-f026d5216a58 {{(pid=59369) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10222}}
[ 610.473945] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "02929ad5-b2f4-4a80-a606-2a0c9b6222e4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 610.474189] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "31c17493-263e-4d08-b669-e901b07060d5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 610.474370] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 610.474546] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "eae87f89-8488-42e6-b065-1198bfbe8177" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 610.474720] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "d501202a-354c-42e1-8480-f026d5216a58" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 610.474886] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 610.475272] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Getting list of instances from cluster (obj){
[ 610.475272] env[59369]:   value = "domain-c8"
[ 610.475272] env[59369]:   _type = "ClusterComputeResource"
[ 610.475272] env[59369]: } {{(pid=59369) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 610.476181] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c33da0de-a7ae-48e2-8d1e-d93563102539 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 610.489301] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Got total of 5 instances {{(pid=59369) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 610.790351] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 610.790460] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 610.790695] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 612.142457] env[59369]: DEBUG nova.compute.manager [req-69cb9cf4-d38a-4fde-813a-cb734a41774a req-16822e4e-f1f2-438f-955f-69a64f67a7ed service nova] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Received event network-changed-bbe86fc8-b664-48e5-b16f-944d69cf7ecd {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 612.142936] env[59369]: DEBUG nova.compute.manager [req-69cb9cf4-d38a-4fde-813a-cb734a41774a req-16822e4e-f1f2-438f-955f-69a64f67a7ed service nova] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Refreshing instance network info cache due to event network-changed-bbe86fc8-b664-48e5-b16f-944d69cf7ecd. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}}
[ 612.142936] env[59369]: DEBUG oslo_concurrency.lockutils [req-69cb9cf4-d38a-4fde-813a-cb734a41774a req-16822e4e-f1f2-438f-955f-69a64f67a7ed service nova] Acquiring lock "refresh_cache-31c17493-263e-4d08-b669-e901b07060d5" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 612.143054] env[59369]: DEBUG oslo_concurrency.lockutils [req-69cb9cf4-d38a-4fde-813a-cb734a41774a req-16822e4e-f1f2-438f-955f-69a64f67a7ed service nova] Acquired lock "refresh_cache-31c17493-263e-4d08-b669-e901b07060d5" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 612.143160] env[59369]: DEBUG nova.network.neutron [req-69cb9cf4-d38a-4fde-813a-cb734a41774a req-16822e4e-f1f2-438f-955f-69a64f67a7ed service nova] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Refreshing network info cache for port bbe86fc8-b664-48e5-b16f-944d69cf7ecd {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 612.394757] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Received event network-vif-plugged-eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 612.395381] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Acquiring lock "b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 612.396467] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Lock "b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 612.396467] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Lock "b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 612.396467] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] No waiting events found dispatching network-vif-plugged-eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91 {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 612.396467] env[59369]: WARNING nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Received unexpected event network-vif-plugged-eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91 for instance with vm_state building and task_state spawning.
[ 612.398429] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Received event network-changed-eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 612.398429] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Refreshing instance network info cache due to event network-changed-eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}}
[ 612.398429] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Acquiring lock "refresh_cache-b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 612.398429] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Acquired lock "refresh_cache-b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 612.398429] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Refreshing network info cache for port eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91 {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 613.238890] env[59369]: DEBUG nova.network.neutron [req-69cb9cf4-d38a-4fde-813a-cb734a41774a req-16822e4e-f1f2-438f-955f-69a64f67a7ed service nova] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Updated VIF entry in instance network info cache for port bbe86fc8-b664-48e5-b16f-944d69cf7ecd. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 613.239367] env[59369]: DEBUG nova.network.neutron [req-69cb9cf4-d38a-4fde-813a-cb734a41774a req-16822e4e-f1f2-438f-955f-69a64f67a7ed service nova] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Updating instance_info_cache with network_info: [{"id": "bbe86fc8-b664-48e5-b16f-944d69cf7ecd", "address": "fa:16:3e:16:e3:63", "network": {"id": "5bf3b46a-5cae-491d-ab4e-f9774d11c1d8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-332625799-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2eeec6feafe84002a6fc8c2050a1b3e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "209639b9-c313-4b35-86dc-dccd744d174a", "external-id": "nsx-vlan-transportzone-868", "segmentation_id": 868, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbbe86fc8-b6", "ovs_interfaceid": "bbe86fc8-b664-48e5-b16f-944d69cf7ecd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 613.251157] env[59369]: DEBUG oslo_concurrency.lockutils [req-69cb9cf4-d38a-4fde-813a-cb734a41774a req-16822e4e-f1f2-438f-955f-69a64f67a7ed service nova] Releasing lock "refresh_cache-31c17493-263e-4d08-b669-e901b07060d5" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 613.311542] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Updated VIF entry in instance network info cache for port eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 613.311622] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Updating instance_info_cache with network_info: [{"id": "eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91", "address": "fa:16:3e:37:ad:42", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.231", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeddd35f6-cd", "ovs_interfaceid": "eddd35f6-cd0f-4e6a-8b7b-4d3b4b8ccb91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 613.323675] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Releasing lock "refresh_cache-b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 613.323819] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Received event network-changed-a1ad8c18-3ecc-4f30-bd35-ed52c7085368 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 613.324451] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Refreshing instance network info cache due to event network-changed-a1ad8c18-3ecc-4f30-bd35-ed52c7085368. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}}
[ 613.324451] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Acquiring lock "refresh_cache-02929ad5-b2f4-4a80-a606-2a0c9b6222e4" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 613.324451] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Acquired lock "refresh_cache-02929ad5-b2f4-4a80-a606-2a0c9b6222e4" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 613.324540] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Refreshing network info cache for port a1ad8c18-3ecc-4f30-bd35-ed52c7085368 {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 613.898217] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Updated VIF entry in instance network info cache for port a1ad8c18-3ecc-4f30-bd35-ed52c7085368. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 613.898592] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Updating instance_info_cache with network_info: [{"id": "a1ad8c18-3ecc-4f30-bd35-ed52c7085368", "address": "fa:16:3e:2c:95:75", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa1ad8c18-3e", "ovs_interfaceid": "a1ad8c18-3ecc-4f30-bd35-ed52c7085368", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 613.918296] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Releasing lock "refresh_cache-02929ad5-b2f4-4a80-a606-2a0c9b6222e4" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 613.918296] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Received event network-vif-plugged-6c523737-a1dc-4be3-a0b1-46b4186741ff {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 613.918296] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Acquiring lock "eae87f89-8488-42e6-b065-1198bfbe8177-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 613.918296] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Lock "eae87f89-8488-42e6-b065-1198bfbe8177-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 613.918498] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Lock "eae87f89-8488-42e6-b065-1198bfbe8177-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 613.918498] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] No waiting events found dispatching network-vif-plugged-6c523737-a1dc-4be3-a0b1-46b4186741ff {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 613.918498] env[59369]: WARNING nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Received unexpected event network-vif-plugged-6c523737-a1dc-4be3-a0b1-46b4186741ff for instance with vm_state building and task_state spawning.
[ 613.918498] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Received event network-changed-6c523737-a1dc-4be3-a0b1-46b4186741ff {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 613.918633] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Refreshing instance network info cache due to event network-changed-6c523737-a1dc-4be3-a0b1-46b4186741ff. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}}
[ 613.918633] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Acquiring lock "refresh_cache-eae87f89-8488-42e6-b065-1198bfbe8177" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 613.918633] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Acquired lock "refresh_cache-eae87f89-8488-42e6-b065-1198bfbe8177" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 613.918633] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Refreshing network info cache for port 6c523737-a1dc-4be3-a0b1-46b4186741ff {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 614.440996] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Updated VIF entry in instance network info cache for port 6c523737-a1dc-4be3-a0b1-46b4186741ff. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 614.441440] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Updating instance_info_cache with network_info: [{"id": "6c523737-a1dc-4be3-a0b1-46b4186741ff", "address": "fa:16:3e:74:fd:25", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.51", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c523737-a1", "ovs_interfaceid": "6c523737-a1dc-4be3-a0b1-46b4186741ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 614.455790] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Releasing lock "refresh_cache-eae87f89-8488-42e6-b065-1198bfbe8177" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 614.457889] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: d501202a-354c-42e1-8480-f026d5216a58] Received event network-vif-plugged-92977e32-ac33-400b-a6d8-cd7726581aff {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 614.457889] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Acquiring lock "d501202a-354c-42e1-8480-f026d5216a58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 614.457889] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Lock "d501202a-354c-42e1-8480-f026d5216a58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 614.457889] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Lock "d501202a-354c-42e1-8480-f026d5216a58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 614.458160] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: d501202a-354c-42e1-8480-f026d5216a58] No waiting events found dispatching network-vif-plugged-92977e32-ac33-400b-a6d8-cd7726581aff {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 614.458160] env[59369]: WARNING nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: d501202a-354c-42e1-8480-f026d5216a58] Received unexpected event network-vif-plugged-92977e32-ac33-400b-a6d8-cd7726581aff for instance with vm_state building and task_state spawning.
[ 614.458160] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: d501202a-354c-42e1-8480-f026d5216a58] Received event network-changed-92977e32-ac33-400b-a6d8-cd7726581aff {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 614.458160] env[59369]: DEBUG nova.compute.manager [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: d501202a-354c-42e1-8480-f026d5216a58] Refreshing instance network info cache due to event network-changed-92977e32-ac33-400b-a6d8-cd7726581aff. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}}
[ 614.458160] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Acquiring lock "refresh_cache-d501202a-354c-42e1-8480-f026d5216a58" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 614.458434] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Acquired lock "refresh_cache-d501202a-354c-42e1-8480-f026d5216a58" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 614.458434] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: d501202a-354c-42e1-8480-f026d5216a58] Refreshing network info cache for port 92977e32-ac33-400b-a6d8-cd7726581aff {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 614.804643] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Acquiring lock "ad662e06-1a0f-4110-8d23-8ff6c6889eee" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 614.804973] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Lock "ad662e06-1a0f-4110-8d23-8ff6c6889eee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 614.820142] env[59369]: DEBUG nova.compute.manager [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Starting instance... {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 614.880288] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 614.880522] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 614.881971] env[59369]: INFO nova.compute.claims [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 615.047205] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-772b15ec-0475-4110-9c32-3a0e23d28305 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.054696] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e889932-19fb-4b49-91cd-9aec6dd8d955 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.087785] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88d49396-9fa5-44e3-a939-1432069c134d {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.096095] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6686465-5ee9-474f-bb33-541f80b7384f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.112424] env[59369]: DEBUG nova.compute.provider_tree [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 615.119735] env[59369]: DEBUG nova.scheduler.client.report [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 615.140902] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 615.141415] env[59369]: DEBUG nova.compute.manager [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 615.197846] env[59369]: DEBUG nova.compute.utils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 615.199228] env[59369]: DEBUG nova.compute.manager [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Allocating IP information in the background. {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 615.199426] env[59369]: DEBUG nova.network.neutron [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 615.209941] env[59369]: DEBUG nova.compute.manager [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 615.281355] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: d501202a-354c-42e1-8480-f026d5216a58] Updated VIF entry in instance network info cache for port 92977e32-ac33-400b-a6d8-cd7726581aff. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 615.281691] env[59369]: DEBUG nova.network.neutron [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] [instance: d501202a-354c-42e1-8480-f026d5216a58] Updating instance_info_cache with network_info: [{"id": "92977e32-ac33-400b-a6d8-cd7726581aff", "address": "fa:16:3e:98:2f:34", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92977e32-ac", "ovs_interfaceid": "92977e32-ac33-400b-a6d8-cd7726581aff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 615.291579] env[59369]: DEBUG nova.compute.manager [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Start spawning the instance on the hypervisor. {{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 615.295590] env[59369]: DEBUG oslo_concurrency.lockutils [req-4a503809-4858-4b63-931a-81ae4ed8aed0 req-2e42c0c8-51da-49eb-a49c-e9f062bcc081 service nova] Releasing lock "refresh_cache-d501202a-354c-42e1-8480-f026d5216a58" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 615.297224] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 615.297749] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 615.297749] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Starting heal instance info cache {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 615.297749] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Rebuilding the list of instances to heal {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}}
[ 615.314902] env[59369]: DEBUG nova.virt.hardware [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=<?>,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-29T11:50:42Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 615.315155] env[59369]: DEBUG nova.virt.hardware [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 615.315352] env[59369]: DEBUG nova.virt.hardware [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 615.315547] env[59369]: DEBUG nova.virt.hardware [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Flavor pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 615.315690] env[59369]: DEBUG nova.virt.hardware [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687
tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 615.315828] env[59369]: DEBUG nova.virt.hardware [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 615.316703] env[59369]: DEBUG nova.virt.hardware [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 615.316939] env[59369]: DEBUG nova.virt.hardware [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 615.317143] env[59369]: DEBUG nova.virt.hardware [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 615.317370] env[59369]: DEBUG nova.virt.hardware [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 615.317485] env[59369]: DEBUG nova.virt.hardware [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 615.319262] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a05ad6f0-171a-4be4-835c-9b333ef26729 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.323715] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 615.323892] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 615.324042] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 615.324178] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Skipping network cache update for instance because it is Building. 
{{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 615.324301] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: d501202a-354c-42e1-8480-f026d5216a58] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 615.324945] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 615.325131] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Didn't find any instances for network info cache update. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 615.325797] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 615.326194] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 615.326398] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 615.326607] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task 
ComputeManager._instance_usage_audit {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 615.326790] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 615.326967] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 615.327151] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59369) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 615.327302] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 615.331602] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6defeb3e-df7e-4e5f-ab67-60b834f28acb {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.339157] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.339379] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.339534] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.339674] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59369) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 615.341012] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e8f653a-8739-47c4-b1a9-9330748e529d {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.357712] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70164686-81f6-4f47-a799-57b75269c58d {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.372929] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-505b3f9f-6e6b-4ceb-8458-1e0e135a3316 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.379900] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b883b01c-6ff8-45b5-909c-3d87c24ccd6f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.410018] env[59369]: DEBUG nova.compute.resource_tracker [None 
req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181827MB free_disk=116GB free_vcpus=48 pci_devices=None {{(pid=59369) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 615.410179] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.411969] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.413184] env[59369]: DEBUG nova.policy [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4917736f61a846e192878e2375e54a68', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '455b57fccb804ae8a1ef877d9ac0e873', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}} [ 615.466569] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 02929ad5-b2f4-4a80-a606-2a0c9b6222e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 
'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 615.466827] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 31c17493-263e-4d08-b669-e901b07060d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 615.466827] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 615.467200] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance eae87f89-8488-42e6-b065-1198bfbe8177 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 615.467200] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance d501202a-354c-42e1-8480-f026d5216a58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 615.467200] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance ad662e06-1a0f-4110-8d23-8ff6c6889eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 615.467343] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=59369) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 615.467537] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=59369) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 615.659975] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8190a4f-9517-41e2-9bf3-e53ee4be80a8 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.667792] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88a9eab8-4e5d-48d9-afeb-89bc15880f6d {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.699506] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78e01080-72b4-4fd0-abd5-cb2edbc75340 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.707340] env[59369]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c93de19-1f5e-46d7-8a88-ae8328578f41 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.721433] env[59369]: DEBUG nova.compute.provider_tree [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 615.737457] env[59369]: DEBUG nova.scheduler.client.report [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 615.756777] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59369) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 615.756939] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.347s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 616.530067] env[59369]: DEBUG nova.network.neutron [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 
tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Successfully created port: c4b18c68-0b74-48de-8e67-aeda476d896a {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 618.426541] env[59369]: DEBUG nova.network.neutron [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Successfully updated port: c4b18c68-0b74-48de-8e67-aeda476d896a {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 618.436316] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Acquiring lock "refresh_cache-ad662e06-1a0f-4110-8d23-8ff6c6889eee" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 618.436316] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Acquired lock "refresh_cache-ad662e06-1a0f-4110-8d23-8ff6c6889eee" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 618.436443] env[59369]: DEBUG nova.network.neutron [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 618.577136] env[59369]: DEBUG nova.network.neutron [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 
tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Instance cache missing network info. {{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 618.912277] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Acquiring lock "3089e10b-f9fd-4049-b8f4-9297fe6a7c86" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.912490] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Lock "3089e10b-f9fd-4049-b8f4-9297fe6a7c86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.922795] env[59369]: DEBUG nova.compute.manager [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Starting instance... 
{{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 618.981316] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.981544] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.983488] env[59369]: INFO nova.compute.claims [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 619.165062] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0b49a1e-4f61-4acf-9d6a-2de19ace27e2 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.172696] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01f93f60-3b6b-4663-96b2-b4c8461ba026 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.209780] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-371f0858-e53d-4c34-b288-4a0c376c2828 {{(pid=59369) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.217333] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b060e06-9311-4b5e-875f-9b20f9d0abe4 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.235134] env[59369]: DEBUG nova.compute.provider_tree [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 619.245075] env[59369]: DEBUG nova.scheduler.client.report [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 619.263564] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 619.265037] env[59369]: DEBUG nova.compute.manager [None 
req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 619.295484] env[59369]: DEBUG nova.network.neutron [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Updating instance_info_cache with network_info: [{"id": "c4b18c68-0b74-48de-8e67-aeda476d896a", "address": "fa:16:3e:9b:17:45", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.63", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc4b18c68-0b", "ovs_interfaceid": "c4b18c68-0b74-48de-8e67-aeda476d896a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 619.310955] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d 
tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Releasing lock "refresh_cache-ad662e06-1a0f-4110-8d23-8ff6c6889eee" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 619.311593] env[59369]: DEBUG nova.compute.manager [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Instance network_info: |[{"id": "c4b18c68-0b74-48de-8e67-aeda476d896a", "address": "fa:16:3e:9b:17:45", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.63", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc4b18c68-0b", "ovs_interfaceid": "c4b18c68-0b74-48de-8e67-aeda476d896a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 619.312526] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 
tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9b:17:45', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad72c645-a67d-4efd-b563-28e44077e68d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c4b18c68-0b74-48de-8e67-aeda476d896a', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 619.323492] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Creating folder: Project (455b57fccb804ae8a1ef877d9ac0e873). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 619.327035] env[59369]: DEBUG nova.compute.utils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 619.327204] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-61e74081-4040-4a1e-9961-0fcd76186c68 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.329079] env[59369]: DEBUG nova.compute.manager [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Allocating IP information in the background. 
{{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 619.329248] env[59369]: DEBUG nova.network.neutron [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 619.337322] env[59369]: DEBUG nova.compute.manager [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 619.344305] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Created folder: Project (455b57fccb804ae8a1ef877d9ac0e873) in parent group-v121837. [ 619.347406] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Creating folder: Instances. Parent ref: group-v121853. 
{{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 619.348250] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0fd54c5e-6175-4872-9d47-5958beb84a0b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.358280] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Created folder: Instances in parent group-v121853. [ 619.358508] env[59369]: DEBUG oslo.service.loopingcall [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 619.358684] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 619.358872] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1b3e5077-1f62-4df3-936f-87846300fc2e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.388939] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 619.388939] env[59369]: value = "task-463238" [ 619.388939] env[59369]: _type = "Task" [ 619.388939] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 619.399221] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463238, 'name': CreateVM_Task} progress is 6%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 619.443024] env[59369]: DEBUG nova.compute.manager [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Start spawning the instance on the hypervisor. {{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 619.466656] env[59369]: DEBUG nova.virt.hardware [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 619.466656] env[59369]: DEBUG nova.virt.hardware [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 619.466770] env[59369]: DEBUG nova.virt.hardware [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 
tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 619.466915] env[59369]: DEBUG nova.virt.hardware [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Flavor pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 619.467061] env[59369]: DEBUG nova.virt.hardware [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 619.467199] env[59369]: DEBUG nova.virt.hardware [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 619.467395] env[59369]: DEBUG nova.virt.hardware [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 619.467538] env[59369]: DEBUG nova.virt.hardware [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 619.467693] 
env[59369]: DEBUG nova.virt.hardware [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 619.467841] env[59369]: DEBUG nova.virt.hardware [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 619.467996] env[59369]: DEBUG nova.virt.hardware [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 619.469323] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdfd62d3-80cd-4634-90ae-bf37f1cecce7 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.479830] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5ba33cd-3d7b-48dd-bdec-aeb2b47337ae {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.514262] env[59369]: DEBUG nova.policy [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aedeb9f10a7c4eee8ed4c152f749ec94', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'3d67fc0b199b4807ba31fa60831085aa', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}} [ 619.900268] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463238, 'name': CreateVM_Task, 'duration_secs': 0.306272} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 619.900433] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 619.901365] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 619.901521] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 619.901834] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 619.902085] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-755215de-39af-49f4-9d3f-7798003d72ce {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.910400] env[59369]: DEBUG oslo_vmware.api [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Waiting for the task: (returnval){ [ 619.910400] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]52061be4-26a2-4ddd-95e7-bf40e561cba6" [ 619.910400] env[59369]: _type = "Task" [ 619.910400] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 619.916527] env[59369]: DEBUG oslo_vmware.api [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]52061be4-26a2-4ddd-95e7-bf40e561cba6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 620.419284] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 620.419538] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 620.419747] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4b489d12-c9ea-48cd-8cc1-de9ad8caab5d tempest-ServersAdminNegativeTestJSON-1608821687 tempest-ServersAdminNegativeTestJSON-1608821687-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 620.902331] env[59369]: DEBUG nova.network.neutron [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Successfully created port: f065c1ed-1c94-4399-98e7-1137b20994b2 {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 621.191023] env[59369]: DEBUG nova.compute.manager [req-518d40eb-8834-4cb4-98d6-114c218a4338 req-2feb7130-f3fb-4cd5-8e26-73a4db4ab411 service nova] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Received event 
network-vif-plugged-c4b18c68-0b74-48de-8e67-aeda476d896a {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 621.191270] env[59369]: DEBUG oslo_concurrency.lockutils [req-518d40eb-8834-4cb4-98d6-114c218a4338 req-2feb7130-f3fb-4cd5-8e26-73a4db4ab411 service nova] Acquiring lock "ad662e06-1a0f-4110-8d23-8ff6c6889eee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.191940] env[59369]: DEBUG oslo_concurrency.lockutils [req-518d40eb-8834-4cb4-98d6-114c218a4338 req-2feb7130-f3fb-4cd5-8e26-73a4db4ab411 service nova] Lock "ad662e06-1a0f-4110-8d23-8ff6c6889eee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.191940] env[59369]: DEBUG oslo_concurrency.lockutils [req-518d40eb-8834-4cb4-98d6-114c218a4338 req-2feb7130-f3fb-4cd5-8e26-73a4db4ab411 service nova] Lock "ad662e06-1a0f-4110-8d23-8ff6c6889eee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 621.191940] env[59369]: DEBUG nova.compute.manager [req-518d40eb-8834-4cb4-98d6-114c218a4338 req-2feb7130-f3fb-4cd5-8e26-73a4db4ab411 service nova] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] No waiting events found dispatching network-vif-plugged-c4b18c68-0b74-48de-8e67-aeda476d896a {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 621.195156] env[59369]: WARNING nova.compute.manager [req-518d40eb-8834-4cb4-98d6-114c218a4338 req-2feb7130-f3fb-4cd5-8e26-73a4db4ab411 service nova] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Received unexpected event network-vif-plugged-c4b18c68-0b74-48de-8e67-aeda476d896a for instance 
with vm_state building and task_state spawning. [ 622.429500] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Acquiring lock "3117e247-0538-4a30-a0d7-aa47247a6da1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.429798] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Lock "3117e247-0538-4a30-a0d7-aa47247a6da1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.442182] env[59369]: DEBUG nova.compute.manager [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Starting instance... 
{{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 622.505053] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.505053] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.506568] env[59369]: INFO nova.compute.claims [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 622.712479] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45cc4765-a74d-431b-a08c-ffb1da487ffe {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.720308] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3867027-3889-46b9-a10e-b4181f1dda40 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.752734] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b930380-703a-4860-b6d7-5da190243172 {{(pid=59369) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.761324] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a20cd8eb-89b2-40ec-a040-a6df69e446c9 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.774917] env[59369]: DEBUG nova.compute.provider_tree [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 622.792124] env[59369]: DEBUG nova.scheduler.client.report [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 622.816020] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.312s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.816834] env[59369]: DEBUG nova.compute.manager [None req-4e3b8987-3580-41ec-86db-2724b58f23fe 
tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 622.854518] env[59369]: DEBUG nova.compute.utils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 622.855501] env[59369]: DEBUG nova.compute.manager [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Allocating IP information in the background. {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 622.856178] env[59369]: DEBUG nova.network.neutron [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 622.870396] env[59369]: DEBUG nova.compute.manager [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 622.954831] env[59369]: DEBUG nova.compute.manager [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Start spawning the instance on the hypervisor. 
{{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 622.984461] env[59369]: DEBUG nova.virt.hardware [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 622.984709] env[59369]: DEBUG nova.virt.hardware [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 622.984859] env[59369]: DEBUG nova.virt.hardware [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 622.987248] env[59369]: DEBUG nova.virt.hardware [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Flavor pref 0:0:0 {{(pid=59369) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 622.987487] env[59369]: DEBUG nova.virt.hardware [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 622.987647] env[59369]: DEBUG nova.virt.hardware [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 622.988018] env[59369]: DEBUG nova.virt.hardware [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 622.988018] env[59369]: DEBUG nova.virt.hardware [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 622.988208] env[59369]: DEBUG nova.virt.hardware [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 622.988386] env[59369]: DEBUG nova.virt.hardware [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 622.988498] env[59369]: DEBUG nova.virt.hardware [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 622.991976] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-653b757d-599e-4c57-aa00-6b9a7c0e560d {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.002540] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93780d92-93f2-4562-aa37-06e33b0f6b82 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.102633] env[59369]: DEBUG nova.policy [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '81c4c9653b314489a451b4ca0072f84e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82808309113e47c481e8b7a9e958594a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}} [ 623.600185] env[59369]: DEBUG nova.network.neutron [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 
3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Successfully updated port: f065c1ed-1c94-4399-98e7-1137b20994b2 {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 623.613765] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Acquiring lock "refresh_cache-3089e10b-f9fd-4049-b8f4-9297fe6a7c86" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 623.613908] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Acquired lock "refresh_cache-3089e10b-f9fd-4049-b8f4-9297fe6a7c86" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 623.614068] env[59369]: DEBUG nova.network.neutron [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 623.894728] env[59369]: DEBUG nova.network.neutron [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Instance cache missing network info. 
{{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 624.027524] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Acquiring lock "f2673a5e-28b0-4a93-b93b-8eef64380e08" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.027792] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Lock "f2673a5e-28b0-4a93-b93b-8eef64380e08" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.040036] env[59369]: DEBUG nova.compute.manager [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Starting instance... 
{{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 624.110228] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.110529] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.112729] env[59369]: INFO nova.compute.claims [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 624.333751] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b66cdee-eee6-4215-a2b1-3964999c8b56 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.342619] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7963a414-6cb5-4a38-ac6d-51b7ef49a1d6 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.381280] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b26b46bb-3c50-438e-b579-7b209e5ab9f2 {{(pid=59369) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.390027] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12cd0009-b618-4570-b1da-beeff4c850cd {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.406716] env[59369]: DEBUG nova.compute.provider_tree [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 624.420611] env[59369]: DEBUG nova.scheduler.client.report [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 624.442162] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.331s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 624.442302] env[59369]: DEBUG nova.compute.manager [None 
req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 624.488158] env[59369]: DEBUG nova.compute.utils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 624.490613] env[59369]: DEBUG nova.compute.manager [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Allocating IP information in the background. {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 624.490780] env[59369]: DEBUG nova.network.neutron [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 624.510239] env[59369]: DEBUG nova.compute.manager [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Start building block device mappings for instance. 
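The inventory data reported above feeds the claim that just succeeded. Placement's effective capacity per resource class is `(total - reserved) * allocation_ratio`; a small sketch applying that to the logged inventory (`min_unit`/`max_unit`/`step_size` constrain individual allocations, not the total):

```python
def usable_capacity(inventory):
    """Effective capacity per resource class, as Placement computes it:
    (total - reserved) * allocation_ratio."""
    return {
        rc: int((inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
        for rc, inv in inventory.items()
    }

# Inventory as reported for provider 6ace6145-... in the log above.
inventory = {
    "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
}
```

With the 4.0 VCPU allocation ratio, the 48 physical cores admit 192 vCPUs of claims, which is why a 1-vCPU m1.nano claim succeeds trivially here.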
{{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 624.606982] env[59369]: DEBUG nova.compute.manager [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Start spawning the instance on the hypervisor. {{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 624.635267] env[59369]: DEBUG nova.virt.hardware [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 624.635267] env[59369]: DEBUG nova.virt.hardware [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 624.635267] env[59369]: DEBUG nova.virt.hardware [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 
tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 624.635452] env[59369]: DEBUG nova.virt.hardware [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Flavor pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 624.635452] env[59369]: DEBUG nova.virt.hardware [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 624.635452] env[59369]: DEBUG nova.virt.hardware [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 624.635452] env[59369]: DEBUG nova.virt.hardware [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 624.635452] env[59369]: DEBUG nova.virt.hardware [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
624.635593] env[59369]: DEBUG nova.virt.hardware [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 624.638479] env[59369]: DEBUG nova.virt.hardware [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 624.639078] env[59369]: DEBUG nova.virt.hardware [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 624.639971] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0679898a-4802-4f63-9d73-3b9159c0bd20 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.655546] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c610ca08-d95b-4419-a2a2-5219a7914d11 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.836508] env[59369]: DEBUG nova.policy [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '36c5e471ef814398b33ad3d4d2dc4a8d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 
'project_id': '6ae7d80622f8452ab598b3fb9c0d5730', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}} [ 625.075241] env[59369]: DEBUG nova.network.neutron [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Updating instance_info_cache with network_info: [{"id": "f065c1ed-1c94-4399-98e7-1137b20994b2", "address": "fa:16:3e:cb:51:ab", "network": {"id": "fc421da7-33a6-4ae6-8d0d-efd58ccfc207", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-76931201-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d67fc0b199b4807ba31fa60831085aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf065c1ed-1c", "ovs_interfaceid": "f065c1ed-1c94-4399-98e7-1137b20994b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 625.095128] env[59369]: DEBUG oslo_concurrency.lockutils [None 
req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Releasing lock "refresh_cache-3089e10b-f9fd-4049-b8f4-9297fe6a7c86" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 625.096717] env[59369]: DEBUG nova.compute.manager [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Instance network_info: |[{"id": "f065c1ed-1c94-4399-98e7-1137b20994b2", "address": "fa:16:3e:cb:51:ab", "network": {"id": "fc421da7-33a6-4ae6-8d0d-efd58ccfc207", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-76931201-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d67fc0b199b4807ba31fa60831085aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf065c1ed-1c", "ovs_interfaceid": "f065c1ed-1c94-4399-98e7-1137b20994b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 625.097073] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 
tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cb:51:ab', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '459b8c74-0aa6-42b6-996a-42b1c5d7e5c6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f065c1ed-1c94-4399-98e7-1137b20994b2', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 625.104975] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Creating folder: Project (3d67fc0b199b4807ba31fa60831085aa). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 625.105486] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a3517319-e3ef-4369-9ef5-cb5b09df666f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.116690] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Created folder: Project (3d67fc0b199b4807ba31fa60831085aa) in parent group-v121837. [ 625.116690] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Creating folder: Instances. Parent ref: group-v121856. 
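The "Instance VIF info" entry above is derived from the neutron `network_info` printed earlier in the log. A sketch of that mapping for the NSX opaque-network case shown here (real nova also handles DVS port groups and other `network_ref` types):

```python
def vif_info_from_network_info(vif, vif_model="vmxnet3"):
    """Map one nova network_info VIF (as printed in the log) onto the
    VMware 'Instance VIF info' structure also printed in the log.
    Illustrative sketch, covering only the nsx.LogicalSwitch case."""
    return {
        "network_name": vif["network"]["bridge"],          # e.g. 'br-int'
        "mac_address": vif["address"],
        "network_ref": {
            "type": "OpaqueNetwork",
            "network-id": vif["details"]["nsx-logical-switch-id"],
            "network-type": "nsx.LogicalSwitch",
            "use-external-id": True,
        },
        "iface_id": vif["id"],
        "vif_model": vif_model,
    }
```

Note that `network-id` comes from the port binding's `nsx-logical-switch-id`, not the neutron network UUID, which is exactly the substitution visible between the two log entries.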
{{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 625.116822] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b05574c0-edba-4033-b8ee-57d9f06d5447 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.129021] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Created folder: Instances in parent group-v121856. [ 625.129021] env[59369]: DEBUG oslo.service.loopingcall [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 625.129021] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 625.129021] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6ac1a3ee-a432-42bb-a727-b0959842a9c8 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.156286] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 625.156286] env[59369]: value = "task-463241" [ 625.156286] env[59369]: _type = "Task" [ 625.156286] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 625.165450] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463241, 'name': CreateVM_Task} progress is 0%. 
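`wait_for_task` above polls `task-463241` (progress 0%, then "completed successfully" with a recorded duration). The loop behind such polling can be sketched generically; `poll` here is a hypothetical callable standing in for the vSphere task-info query:

```python
import time

def wait_for_task(poll, interval=0.5, timeout=60.0):
    """Poll a remote task until it completes -- a generic sketch of the
    oslo_vmware wait_for_task/_poll_task pattern in the log above.
    `poll` returns a dict with 'state' and optional 'progress'/'result'."""
    deadline = time.monotonic() + timeout
    while True:
        info = poll()
        if info["state"] == "success":
            return info.get("result")
        if info["state"] == "error":
            raise RuntimeError(info.get("error", "task failed"))
        if time.monotonic() > deadline:
            raise TimeoutError("task did not complete in %.1fs" % timeout)
        time.sleep(interval)
```

The real implementation runs on an oslo.service looping call at a fixed interval, which is why each poll shows up as its own `_poll_task` debug line.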
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 625.467405] env[59369]: DEBUG nova.network.neutron [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Successfully created port: 280d52b0-daa4-4c93-894d-c38fe3c24e33 {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 625.665405] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463241, 'name': CreateVM_Task, 'duration_secs': 0.314818} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 625.665405] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 625.666180] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 625.666180] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 625.666346] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Acquired external semaphore 
"[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 625.666943] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f0293de4-abf9-4447-b4ad-85448a0fcce3 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.671929] env[59369]: DEBUG oslo_vmware.api [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Waiting for the task: (returnval){ [ 625.671929] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]525b4a11-d545-97e4-96fa-4665776cc7e8" [ 625.671929] env[59369]: _type = "Task" [ 625.671929] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 625.681621] env[59369]: DEBUG oslo_vmware.api [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]525b4a11-d545-97e4-96fa-4665776cc7e8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 626.184109] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 626.184109] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 626.184109] env[59369]: DEBUG oslo_concurrency.lockutils [None req-dd4873a7-db8d-4576-bfb8-ebecfcdc0324 tempest-VolumesAssistedSnapshotsTest-825471597 tempest-VolumesAssistedSnapshotsTest-825471597-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.455144] env[59369]: DEBUG nova.compute.manager [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Received event network-changed-c4b18c68-0b74-48de-8e67-aeda476d896a {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 626.455144] env[59369]: DEBUG nova.compute.manager [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Refreshing instance network info cache due to event 
network-changed-c4b18c68-0b74-48de-8e67-aeda476d896a. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 626.455534] env[59369]: DEBUG oslo_concurrency.lockutils [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] Acquiring lock "refresh_cache-ad662e06-1a0f-4110-8d23-8ff6c6889eee" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.456216] env[59369]: DEBUG oslo_concurrency.lockutils [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] Acquired lock "refresh_cache-ad662e06-1a0f-4110-8d23-8ff6c6889eee" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 626.456680] env[59369]: DEBUG nova.network.neutron [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Refreshing network info cache for port c4b18c68-0b74-48de-8e67-aeda476d896a {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 627.018410] env[59369]: DEBUG nova.network.neutron [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Successfully created port: 74fca338-207c-4385-9caf-13ceb8de1245 {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 628.169529] env[59369]: DEBUG nova.network.neutron [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Updated VIF entry in instance network info cache for port c4b18c68-0b74-48de-8e67-aeda476d896a. 
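The `network-changed` handling above acquires a per-instance `refresh_cache-<uuid>` lock, re-queries the port, and updates the cached VIF entry. A simplified sketch of that dance; `fetch_port` is a hypothetical stand-in for the Neutron API call:

```python
import threading

class NetworkInfoCache:
    """Per-instance network-info cache refreshed on neutron port events --
    a simplified sketch of the refresh_cache-* locking in the log above."""

    def __init__(self, fetch_port):
        self._fetch_port = fetch_port    # hypothetical Neutron lookup
        self._cache = {}                 # instance_uuid -> {port_id: vif_dict}
        self._locks = {}
        self._guard = threading.Lock()

    def _lock_for(self, instance_uuid):
        with self._guard:
            return self._locks.setdefault(instance_uuid, threading.Lock())

    def handle_port_changed(self, instance_uuid, port_id):
        # Serialize with any other refresher of this instance's cache,
        # mirroring the Acquiring/Acquired/Releasing lock lines above.
        with self._lock_for(instance_uuid):
            vifs = self._cache.setdefault(instance_uuid, {})
            vifs[port_id] = self._fetch_port(port_id)  # re-query current state
            return vifs[port_id]
```

Serializing on the instance keeps a concurrent spawn from clobbering the refreshed VIF entry, which is the race the lock in the log exists to prevent.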
{{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 628.170012] env[59369]: DEBUG nova.network.neutron [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Updating instance_info_cache with network_info: [{"id": "c4b18c68-0b74-48de-8e67-aeda476d896a", "address": "fa:16:3e:9b:17:45", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.63", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc4b18c68-0b", "ovs_interfaceid": "c4b18c68-0b74-48de-8e67-aeda476d896a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.179696] env[59369]: DEBUG oslo_concurrency.lockutils [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] Releasing lock "refresh_cache-ad662e06-1a0f-4110-8d23-8ff6c6889eee" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 628.180033] env[59369]: DEBUG nova.compute.manager [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 
req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Received event network-vif-plugged-f065c1ed-1c94-4399-98e7-1137b20994b2 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 628.180119] env[59369]: DEBUG oslo_concurrency.lockutils [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] Acquiring lock "3089e10b-f9fd-4049-b8f4-9297fe6a7c86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 628.180307] env[59369]: DEBUG oslo_concurrency.lockutils [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] Lock "3089e10b-f9fd-4049-b8f4-9297fe6a7c86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 628.180457] env[59369]: DEBUG oslo_concurrency.lockutils [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] Lock "3089e10b-f9fd-4049-b8f4-9297fe6a7c86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 628.180605] env[59369]: DEBUG nova.compute.manager [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] No waiting events found dispatching network-vif-plugged-f065c1ed-1c94-4399-98e7-1137b20994b2 {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 628.180943] env[59369]: WARNING nova.compute.manager [req-48e8884b-f12a-4126-b7e6-fbdf3cebf862 req-92158715-6081-48cd-95e4-6c101a587cd0 service nova] [instance: 
3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Received unexpected event network-vif-plugged-f065c1ed-1c94-4399-98e7-1137b20994b2 for instance with vm_state building and task_state spawning. [ 628.873283] env[59369]: DEBUG nova.network.neutron [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Successfully updated port: 280d52b0-daa4-4c93-894d-c38fe3c24e33 {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 628.885734] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Acquiring lock "refresh_cache-3117e247-0538-4a30-a0d7-aa47247a6da1" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 628.885878] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Acquired lock "refresh_cache-3117e247-0538-4a30-a0d7-aa47247a6da1" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 628.886043] env[59369]: DEBUG nova.network.neutron [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 629.278784] env[59369]: DEBUG nova.network.neutron [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Instance cache missing network info. 
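The `pop_instance_event` sequence above (lock `<uuid>-events`, "No waiting events found", then the WARNING about an unexpected `network-vif-plugged`) reflects a waiter registry: spawners register interest in an event before it can arrive, and external events that find no waiter are logged as unexpected. A minimal sketch of that registry, not nova's actual class:

```python
import threading

class InstanceEvents:
    """Sketch of the waiter registry behind pop_instance_event: spawners
    register interest in events such as 'network-vif-plugged-<port>';
    incoming external events pop the waiter, or return None when nobody
    is waiting (the 'Received unexpected event' case in the log)."""

    def __init__(self):
        self._lock = threading.Lock()   # plays the "<uuid>-events" lock role
        self._waiters = {}              # (instance, event) -> threading.Event

    def prepare(self, instance, event):
        """Register a waiter before triggering the action that emits `event`."""
        with self._lock:
            waiter = threading.Event()
            self._waiters[(instance, event)] = waiter
            return waiter

    def pop(self, instance, event):
        """Return the registered waiter for `event`, or None if unexpected."""
        with self._lock:
            return self._waiters.pop((instance, event), None)
```

Here the event arrived while the instance was still `building`/`spawning` and nothing had registered for it yet, so the warning is benign: the plug simply completed before nova started waiting on it.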
{{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 629.935267] env[59369]: DEBUG nova.network.neutron [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Updating instance_info_cache with network_info: [{"id": "280d52b0-daa4-4c93-894d-c38fe3c24e33", "address": "fa:16:3e:e9:88:bf", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap280d52b0-da", "ovs_interfaceid": "280d52b0-daa4-4c93-894d-c38fe3c24e33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.950684] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Releasing lock "refresh_cache-3117e247-0538-4a30-a0d7-aa47247a6da1" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 629.951064] 
env[59369]: DEBUG nova.compute.manager [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Instance network_info: |[{"id": "280d52b0-daa4-4c93-894d-c38fe3c24e33", "address": "fa:16:3e:e9:88:bf", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap280d52b0-da", "ovs_interfaceid": "280d52b0-daa4-4c93-894d-c38fe3c24e33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 629.951638] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e9:88:bf', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad72c645-a67d-4efd-b563-28e44077e68d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 
'280d52b0-daa4-4c93-894d-c38fe3c24e33', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 629.960845] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Creating folder: Project (82808309113e47c481e8b7a9e958594a). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 629.961625] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-33cd8c5d-5633-462d-8331-c4f8da42a559 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.973477] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Created folder: Project (82808309113e47c481e8b7a9e958594a) in parent group-v121837. [ 629.973719] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Creating folder: Instances. Parent ref: group-v121859. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 629.973948] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c1a7d215-238d-4a9b-a84f-38ca63018217 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.983446] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Created folder: Instances in parent group-v121859. 
[ 629.983918] env[59369]: DEBUG oslo.service.loopingcall [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 629.983918] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 629.983918] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3ef64066-3a53-4255-a79d-c0b9ee2c6188 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.005021] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 630.005021] env[59369]: value = "task-463244" [ 630.005021] env[59369]: _type = "Task" [ 630.005021] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 630.014448] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463244, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 630.217432] env[59369]: DEBUG nova.network.neutron [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Successfully updated port: 74fca338-207c-4385-9caf-13ceb8de1245 {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 630.234271] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Acquiring lock "refresh_cache-f2673a5e-28b0-4a93-b93b-8eef64380e08" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 630.234271] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Acquired lock "refresh_cache-f2673a5e-28b0-4a93-b93b-8eef64380e08" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 630.234271] env[59369]: DEBUG nova.network.neutron [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 630.516092] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463244, 'name': CreateVM_Task, 'duration_secs': 0.321716} completed successfully. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 630.516781] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 630.516831] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 630.516994] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 630.517278] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 630.517507] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0f1ccfca-bf55-4a2d-b477-4f4bd6251915 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.522368] env[59369]: DEBUG oslo_vmware.api [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Waiting for the task: (returnval){ [ 630.522368] 
env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]52146a44-4893-9789-9246-1956e5e75c18" [ 630.522368] env[59369]: _type = "Task" [ 630.522368] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 630.530380] env[59369]: DEBUG oslo_vmware.api [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]52146a44-4893-9789-9246-1956e5e75c18, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 630.572523] env[59369]: DEBUG nova.network.neutron [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Instance cache missing network info. {{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 631.036839] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 631.037132] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-4e3b8987-3580-41ec-86db-2724b58f23fe tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 631.037334] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4e3b8987-3580-41ec-86db-2724b58f23fe 
tempest-TenantUsagesTestJSON-1555548295 tempest-TenantUsagesTestJSON-1555548295-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 631.256418] env[59369]: DEBUG nova.network.neutron [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Updating instance_info_cache with network_info: [{"id": "74fca338-207c-4385-9caf-13ceb8de1245", "address": "fa:16:3e:18:9e:cb", "network": {"id": "9d65de8c-f6d0-451e-8e87-e54b49363db9", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1811914259-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "6ae7d80622f8452ab598b3fb9c0d5730", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f66f8375-4460-4acd-987b-acda72bfcf0d", "external-id": "nsx-vlan-transportzone-533", "segmentation_id": 533, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap74fca338-20", "ovs_interfaceid": "74fca338-207c-4385-9caf-13ceb8de1245", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 631.268014] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 
tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Releasing lock "refresh_cache-f2673a5e-28b0-4a93-b93b-8eef64380e08" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 631.268014] env[59369]: DEBUG nova.compute.manager [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Instance network_info: |[{"id": "74fca338-207c-4385-9caf-13ceb8de1245", "address": "fa:16:3e:18:9e:cb", "network": {"id": "9d65de8c-f6d0-451e-8e87-e54b49363db9", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1811914259-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "6ae7d80622f8452ab598b3fb9c0d5730", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f66f8375-4460-4acd-987b-acda72bfcf0d", "external-id": "nsx-vlan-transportzone-533", "segmentation_id": 533, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap74fca338-20", "ovs_interfaceid": "74fca338-207c-4385-9caf-13ceb8de1245", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 631.268562] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 
tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:18:9e:cb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f66f8375-4460-4acd-987b-acda72bfcf0d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '74fca338-207c-4385-9caf-13ceb8de1245', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 631.278282] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Creating folder: Project (6ae7d80622f8452ab598b3fb9c0d5730). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 631.279099] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f7fd6f8a-2005-492f-8451-504bd7897722 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.293599] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Created folder: Project (6ae7d80622f8452ab598b3fb9c0d5730) in parent group-v121837. [ 631.293599] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Creating folder: Instances. Parent ref: group-v121862. 
{{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 631.293599] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-38abdab8-8fc5-4a0b-81a6-7779b7577f25 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.302086] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Created folder: Instances in parent group-v121862. [ 631.302330] env[59369]: DEBUG oslo.service.loopingcall [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 631.302516] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 631.302878] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c7d7d065-acd9-46c3-81a1-0216d5e60401 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.325533] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 631.325533] env[59369]: value = "task-463247" [ 631.325533] env[59369]: _type = "Task" [ 631.325533] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 631.336160] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463247, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 631.841303] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463247, 'name': CreateVM_Task, 'duration_secs': 0.302139} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 631.841601] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 631.842559] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 631.842753] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 631.843554] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 631.843851] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8094c6ed-6699-476b-b146-1aa9f7fe49b5 {{(pid=59369) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.849334] env[59369]: DEBUG oslo_vmware.api [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Waiting for the task: (returnval){ [ 631.849334] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]529ad5db-b575-dc11-c7b2-281c4837c1e6" [ 631.849334] env[59369]: _type = "Task" [ 631.849334] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 631.858300] env[59369]: DEBUG oslo_vmware.api [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]529ad5db-b575-dc11-c7b2-281c4837c1e6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 632.365044] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 632.365044] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 632.365044] env[59369]: DEBUG oslo_concurrency.lockutils [None req-eca9a096-3b3a-4327-a873-aed0a334bfc2 
tempest-ServerRescueTestJSONUnderV235-793800253 tempest-ServerRescueTestJSONUnderV235-793800253-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.327825] env[59369]: DEBUG nova.compute.manager [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Received event network-changed-f065c1ed-1c94-4399-98e7-1137b20994b2 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 633.328070] env[59369]: DEBUG nova.compute.manager [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Refreshing instance network info cache due to event network-changed-f065c1ed-1c94-4399-98e7-1137b20994b2. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 633.328216] env[59369]: DEBUG oslo_concurrency.lockutils [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] Acquiring lock "refresh_cache-3089e10b-f9fd-4049-b8f4-9297fe6a7c86" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.328346] env[59369]: DEBUG oslo_concurrency.lockutils [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] Acquired lock "refresh_cache-3089e10b-f9fd-4049-b8f4-9297fe6a7c86" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 633.328492] env[59369]: DEBUG nova.network.neutron [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Refreshing network info cache for port f065c1ed-1c94-4399-98e7-1137b20994b2 
{{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 634.832012] env[59369]: DEBUG nova.network.neutron [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Updated VIF entry in instance network info cache for port f065c1ed-1c94-4399-98e7-1137b20994b2. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 634.832012] env[59369]: DEBUG nova.network.neutron [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Updating instance_info_cache with network_info: [{"id": "f065c1ed-1c94-4399-98e7-1137b20994b2", "address": "fa:16:3e:cb:51:ab", "network": {"id": "fc421da7-33a6-4ae6-8d0d-efd58ccfc207", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-76931201-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d67fc0b199b4807ba31fa60831085aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "459b8c74-0aa6-42b6-996a-42b1c5d7e5c6", "external-id": "nsx-vlan-transportzone-467", "segmentation_id": 467, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf065c1ed-1c", "ovs_interfaceid": "f065c1ed-1c94-4399-98e7-1137b20994b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 
634.844862] env[59369]: DEBUG oslo_concurrency.lockutils [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] Releasing lock "refresh_cache-3089e10b-f9fd-4049-b8f4-9297fe6a7c86" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 634.844862] env[59369]: DEBUG nova.compute.manager [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Received event network-vif-plugged-280d52b0-daa4-4c93-894d-c38fe3c24e33 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 634.844862] env[59369]: DEBUG oslo_concurrency.lockutils [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] Acquiring lock "3117e247-0538-4a30-a0d7-aa47247a6da1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.844862] env[59369]: DEBUG oslo_concurrency.lockutils [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] Lock "3117e247-0538-4a30-a0d7-aa47247a6da1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.845138] env[59369]: DEBUG oslo_concurrency.lockutils [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] Lock "3117e247-0538-4a30-a0d7-aa47247a6da1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.845138] env[59369]: DEBUG nova.compute.manager [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 
service nova] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] No waiting events found dispatching network-vif-plugged-280d52b0-daa4-4c93-894d-c38fe3c24e33 {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 634.845138] env[59369]: WARNING nova.compute.manager [req-6b189862-6bdd-4ad8-8cab-83375cabd30c req-6cd07d4f-b20b-43b7-aa95-7bad1e761715 service nova] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Received unexpected event network-vif-plugged-280d52b0-daa4-4c93-894d-c38fe3c24e33 for instance with vm_state building and task_state spawning.
[ 637.500826] env[59369]: DEBUG nova.compute.manager [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Received event network-changed-280d52b0-daa4-4c93-894d-c38fe3c24e33 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 637.501180] env[59369]: DEBUG nova.compute.manager [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Refreshing instance network info cache due to event network-changed-280d52b0-daa4-4c93-894d-c38fe3c24e33. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}}
[ 637.503071] env[59369]: DEBUG oslo_concurrency.lockutils [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] Acquiring lock "refresh_cache-3117e247-0538-4a30-a0d7-aa47247a6da1" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 637.503071] env[59369]: DEBUG oslo_concurrency.lockutils [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] Acquired lock "refresh_cache-3117e247-0538-4a30-a0d7-aa47247a6da1" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 637.503071] env[59369]: DEBUG nova.network.neutron [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Refreshing network info cache for port 280d52b0-daa4-4c93-894d-c38fe3c24e33 {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 638.756320] env[59369]: DEBUG nova.network.neutron [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Updated VIF entry in instance network info cache for port 280d52b0-daa4-4c93-894d-c38fe3c24e33. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 638.756597] env[59369]: DEBUG nova.network.neutron [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Updating instance_info_cache with network_info: [{"id": "280d52b0-daa4-4c93-894d-c38fe3c24e33", "address": "fa:16:3e:e9:88:bf", "network": {"id": "b9eff4e8-f1ca-450e-848a-97a956388f68", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "7386534ab8334806bb50f77e095e084c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap280d52b0-da", "ovs_interfaceid": "280d52b0-daa4-4c93-894d-c38fe3c24e33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 638.774882] env[59369]: DEBUG oslo_concurrency.lockutils [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] Releasing lock "refresh_cache-3117e247-0538-4a30-a0d7-aa47247a6da1" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 638.774882] env[59369]: DEBUG nova.compute.manager [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Received event network-vif-plugged-74fca338-207c-4385-9caf-13ceb8de1245 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 638.774882] env[59369]: DEBUG oslo_concurrency.lockutils [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] Acquiring lock "f2673a5e-28b0-4a93-b93b-8eef64380e08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 638.774882] env[59369]: DEBUG oslo_concurrency.lockutils [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] Lock "f2673a5e-28b0-4a93-b93b-8eef64380e08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 638.775024] env[59369]: DEBUG oslo_concurrency.lockutils [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] Lock "f2673a5e-28b0-4a93-b93b-8eef64380e08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 638.775024] env[59369]: DEBUG nova.compute.manager [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] No waiting events found dispatching network-vif-plugged-74fca338-207c-4385-9caf-13ceb8de1245 {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 638.775092] env[59369]: WARNING nova.compute.manager [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Received unexpected event network-vif-plugged-74fca338-207c-4385-9caf-13ceb8de1245 for instance with vm_state building and task_state spawning.
[ 638.775781] env[59369]: DEBUG nova.compute.manager [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Received event network-changed-74fca338-207c-4385-9caf-13ceb8de1245 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 638.775781] env[59369]: DEBUG nova.compute.manager [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Refreshing instance network info cache due to event network-changed-74fca338-207c-4385-9caf-13ceb8de1245. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}}
[ 638.775781] env[59369]: DEBUG oslo_concurrency.lockutils [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] Acquiring lock "refresh_cache-f2673a5e-28b0-4a93-b93b-8eef64380e08" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 638.775781] env[59369]: DEBUG oslo_concurrency.lockutils [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] Acquired lock "refresh_cache-f2673a5e-28b0-4a93-b93b-8eef64380e08" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 638.776037] env[59369]: DEBUG nova.network.neutron [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Refreshing network info cache for port 74fca338-207c-4385-9caf-13ceb8de1245 {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 639.863499] env[59369]: DEBUG nova.network.neutron [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Updated VIF entry in instance network info cache for port 74fca338-207c-4385-9caf-13ceb8de1245. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 639.863499] env[59369]: DEBUG nova.network.neutron [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Updating instance_info_cache with network_info: [{"id": "74fca338-207c-4385-9caf-13ceb8de1245", "address": "fa:16:3e:18:9e:cb", "network": {"id": "9d65de8c-f6d0-451e-8e87-e54b49363db9", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1811914259-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "6ae7d80622f8452ab598b3fb9c0d5730", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f66f8375-4460-4acd-987b-acda72bfcf0d", "external-id": "nsx-vlan-transportzone-533", "segmentation_id": 533, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap74fca338-20", "ovs_interfaceid": "74fca338-207c-4385-9caf-13ceb8de1245", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 639.875335] env[59369]: DEBUG oslo_concurrency.lockutils [req-11ddcaf7-8432-4699-ac41-37570cafc363 req-72d289e5-6cef-4d4b-ad8f-6693ffb67575 service nova] Releasing lock "refresh_cache-f2673a5e-28b0-4a93-b93b-8eef64380e08" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 656.128208] env[59369]: WARNING oslo_vmware.rw_handles [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 656.128208] env[59369]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 656.128208] env[59369]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 656.128208] env[59369]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 656.128208] env[59369]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 656.128208] env[59369]: ERROR oslo_vmware.rw_handles response.begin()
[ 656.128208] env[59369]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 656.128208] env[59369]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 656.128208] env[59369]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 656.128208] env[59369]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 656.128208] env[59369]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 656.128208] env[59369]: ERROR oslo_vmware.rw_handles
[ 656.128208] env[59369]: DEBUG nova.virt.vmwareapi.images [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Downloaded image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to vmware_temp/ce7bae6e-4193-4ad2-8925-c9c0abe682fa/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk on the data store datastore1 {{(pid=59369) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 656.130610] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Caching image {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 656.130719] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Copying Virtual Disk [datastore1] vmware_temp/ce7bae6e-4193-4ad2-8925-c9c0abe682fa/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk to [datastore1] vmware_temp/ce7bae6e-4193-4ad2-8925-c9c0abe682fa/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk {{(pid=59369) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 656.132656] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2f897b5b-36db-4cef-bc98-c856037b0968 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 656.142430] env[59369]: DEBUG oslo_vmware.api [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Waiting for the task: (returnval){
[ 656.142430] env[59369]: value = "task-463259"
[ 656.142430] env[59369]: _type = "Task"
[ 656.142430] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 656.154247] env[59369]: DEBUG oslo_vmware.api [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Task: {'id': task-463259, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 656.658293] env[59369]: DEBUG oslo_vmware.exceptions [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Fault InvalidArgument not matched. {{(pid=59369) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 656.658531] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 656.662384] env[59369]: ERROR nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 656.662384] env[59369]: Faults: ['InvalidArgument']
[ 656.662384] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] Traceback (most recent call last):
[ 656.662384] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 656.662384] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] yield resources
[ 656.662384] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 656.662384] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] self.driver.spawn(context, instance, image_meta,
[ 656.662384] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 656.662384] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 656.662384] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 656.662384] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] self._fetch_image_if_missing(context, vi)
[ 656.662384] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] image_cache(vi, tmp_image_ds_loc)
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] vm_util.copy_virtual_disk(
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] session._wait_for_task(vmdk_copy_task)
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] return self.wait_for_task(task_ref)
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] return evt.wait()
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] result = hub.switch()
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 656.662942] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] return self.greenlet.switch()
[ 656.663634] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 656.663634] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] self.f(*self.args, **self.kw)
[ 656.663634] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 656.663634] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] raise exceptions.translate_fault(task_info.error)
[ 656.663634] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 656.663634] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] Faults: ['InvalidArgument']
[ 656.663634] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5]
[ 656.663634] env[59369]: INFO nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Terminating instance
[ 656.664111] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 656.664293] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 656.664907] env[59369]: DEBUG nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Start destroying the instance on the hypervisor. {{(pid=59369) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 656.665100] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Destroying instance {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 656.665738] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c5b94aa3-2be9-43e0-bfe3-4009ecb7622c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 656.668954] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37e6ed7d-7c8f-4a21-9cee-a53d73b33d18 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 656.678089] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Unregistering the VM {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 656.678800] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-79ac6c4c-21b8-41f2-8bb8-17a17d04b1b1 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 656.680637] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 656.680637] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59369) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 656.681159] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6d9e9ce8-cb4e-4abe-ac9a-4127cb87d66d {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 656.687508] env[59369]: DEBUG oslo_vmware.api [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Waiting for the task: (returnval){
[ 656.687508] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]527ff07f-14dc-48c6-59f3-7d019af718d8"
[ 656.687508] env[59369]: _type = "Task"
[ 656.687508] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 656.696263] env[59369]: DEBUG oslo_vmware.api [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]527ff07f-14dc-48c6-59f3-7d019af718d8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 656.761562] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Unregistered the VM {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 656.761790] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Deleting contents of the VM from datastore datastore1 {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 656.761975] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Deleting the datastore file [datastore1] 31c17493-263e-4d08-b669-e901b07060d5 {{(pid=59369) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 656.762228] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c11e30e8-f1c8-4e30-a397-e25fd03ce33a {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 656.772718] env[59369]: DEBUG oslo_vmware.api [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Waiting for the task: (returnval){
[ 656.772718] env[59369]: value = "task-463261"
[ 656.772718] env[59369]: _type = "Task"
[ 656.772718] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 656.781943] env[59369]: DEBUG oslo_vmware.api [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Task: {'id': task-463261, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 657.205046] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Preparing fetch location {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 657.205046] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Creating directory with path [datastore1] vmware_temp/4e685b79-4225-4aa4-b66b-439c73f57f65/dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 657.205404] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9af36b8b-c931-4c59-9cfa-639015233373 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 657.221045] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Created directory with path [datastore1] vmware_temp/4e685b79-4225-4aa4-b66b-439c73f57f65/dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 657.221045] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Fetch image to [datastore1] vmware_temp/4e685b79-4225-4aa4-b66b-439c73f57f65/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 657.221418] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Downloading image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to [datastore1] vmware_temp/4e685b79-4225-4aa4-b66b-439c73f57f65/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk on the data store datastore1 {{(pid=59369) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 657.222096] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6bceea6-87d8-4181-a3a7-229f758ffe2b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 657.235317] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad929be4-6132-42e0-b71f-cddf8b33e3aa {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 657.249309] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3ce99c9-be3f-41a2-8b0a-f097fa7019eb {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 657.290348] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45c4dcd7-620c-43d7-a63d-df4b371abee1 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 657.301423] env[59369]: DEBUG oslo_vmware.api [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Task: {'id': task-463261, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071402} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 657.301423] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Deleted the datastore file {{(pid=59369) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 657.301423] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Deleted contents of the VM from datastore datastore1 {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 657.301423] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Instance destroyed {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 657.301423] env[59369]: INFO nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Took 0.64 seconds to destroy the instance on the hypervisor.
[ 657.303048] env[59369]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-98e6551a-c1d5-4902-bb2f-b41fb8f1c4d2 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 657.305383] env[59369]: DEBUG nova.compute.claims [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Aborting claim: {{(pid=59369) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 657.305546] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 657.305742] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 657.341357] env[59369]: DEBUG nova.virt.vmwareapi.images [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Downloading image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to the data store datastore1 {{(pid=59369) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 657.409209] env[59369]: DEBUG oslo_vmware.rw_handles [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e685b79-4225-4aa4-b66b-439c73f57f65/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59369) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 657.470035] env[59369]: DEBUG oslo_vmware.rw_handles [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Completed reading data from the image iterator. {{(pid=59369) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 657.470035] env[59369]: DEBUG oslo_vmware.rw_handles [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e685b79-4225-4aa4-b66b-439c73f57f65/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59369) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 657.573511] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5eeee40d-f5f7-4e40-a88b-603ff7ce5eeb {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 657.587133] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f5eea79-2a0d-417c-a44e-04071254eeba {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 657.628021] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53173e85-fc0d-4f7f-a0bf-d05d3b9f8ce2 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 657.638825] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a0fb944-3340-4c71-9374-080154fa6380 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 657.657033] env[59369]: DEBUG nova.compute.provider_tree [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 657.671966] env[59369]: DEBUG nova.scheduler.client.report [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total':
196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 657.697820] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.391s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.697820] env[59369]: ERROR nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 657.697820] env[59369]: Faults: ['InvalidArgument'] [ 657.697820] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] Traceback (most recent call last): [ 657.697820] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 657.697820] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] self.driver.spawn(context, instance, image_meta, [ 657.697820] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 657.697820] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] self._vmops.spawn(context, instance, image_meta, 
injected_files, [ 657.697820] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 657.697820] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] self._fetch_image_if_missing(context, vi) [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] image_cache(vi, tmp_image_ds_loc) [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] vm_util.copy_virtual_disk( [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] session._wait_for_task(vmdk_copy_task) [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] return self.wait_for_task(task_ref) [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] return evt.wait() [ 657.698269] env[59369]: ERROR 
nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] result = hub.switch() [ 657.698269] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 657.698605] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] return self.greenlet.switch() [ 657.698605] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 657.698605] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] self.f(*self.args, **self.kw) [ 657.698605] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 657.698605] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] raise exceptions.translate_fault(task_info.error) [ 657.698605] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 657.698605] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] Faults: ['InvalidArgument'] [ 657.698605] env[59369]: ERROR nova.compute.manager [instance: 31c17493-263e-4d08-b669-e901b07060d5] [ 657.698605] env[59369]: DEBUG nova.compute.utils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] VimFaultException {{(pid=59369) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 657.715103] env[59369]: DEBUG nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Build of instance 31c17493-263e-4d08-b669-e901b07060d5 was re-scheduled: A specified parameter was not correct: fileType [ 657.715103] env[59369]: Faults: ['InvalidArgument'] {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 657.716810] env[59369]: DEBUG nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Unplugging VIFs for instance {{(pid=59369) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 657.716810] env[59369]: DEBUG nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59369) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 657.716810] env[59369]: DEBUG nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Deallocating network for instance {{(pid=59369) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 657.716810] env[59369]: DEBUG nova.network.neutron [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] deallocate_for_instance() {{(pid=59369) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 658.370979] env[59369]: DEBUG nova.network.neutron [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Updating instance_info_cache with network_info: [] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 658.393833] env[59369]: INFO nova.compute.manager [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] [instance: 31c17493-263e-4d08-b669-e901b07060d5] Took 0.68 seconds to deallocate network for instance. 
[ 658.498771] env[59369]: INFO nova.scheduler.client.report [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Deleted allocations for instance 31c17493-263e-4d08-b669-e901b07060d5 [ 658.522328] env[59369]: DEBUG oslo_concurrency.lockutils [None req-83772ab8-3302-4512-a105-2d5417c59aa3 tempest-FloatingIPsAssociationTestJSON-1150651661 tempest-FloatingIPsAssociationTestJSON-1150651661-project-member] Lock "31c17493-263e-4d08-b669-e901b07060d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 58.429s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 658.522567] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "31c17493-263e-4d08-b669-e901b07060d5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 48.048s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 658.522746] env[59369]: INFO nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 31c17493-263e-4d08-b669-e901b07060d5] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 658.522941] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "31c17493-263e-4d08-b669-e901b07060d5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.712952] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 675.748602] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 675.750550] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 675.750550] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 675.792605] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.792605] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock 
"compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.792605] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.792605] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59369) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 675.792605] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f350ee7-f444-499a-bfbd-2e57bc24e7ff {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.797933] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd124b3e-6ea8-405b-b072-8db3112b66b3 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.830142] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c56e5fdf-b5e7-4d49-91a2-f7d2a42bc165 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.850636] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d52bc454-249b-45a8-92ea-11dc374baf99 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.890113] env[59369]: DEBUG nova.compute.resource_tracker [None 
req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181786MB free_disk=116GB free_vcpus=48 pci_devices=None {{(pid=59369) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 675.890289] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.890483] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 676.000217] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 02929ad5-b2f4-4a80-a606-2a0c9b6222e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.000319] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.000719] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance eae87f89-8488-42e6-b065-1198bfbe8177 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.000719] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance d501202a-354c-42e1-8480-f026d5216a58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.000719] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance ad662e06-1a0f-4110-8d23-8ff6c6889eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.000898] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 3089e10b-f9fd-4049-b8f4-9297fe6a7c86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.000898] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 3117e247-0538-4a30-a0d7-aa47247a6da1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.001036] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance f2673a5e-28b0-4a93-b93b-8eef64380e08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.001249] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=59369) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 676.001354] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=59369) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 676.162239] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99d06c55-8ab9-4c9f-910c-1e2f5713b992 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.172245] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-48c37564-7633-4dff-915e-9edc8eef6b86 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.207033] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-245c3f22-5958-4f07-ad03-f5d0701c4a3c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.219431] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-673b1c11-9a4a-4a31-beff-85747f5d058f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.237062] env[59369]: DEBUG nova.compute.provider_tree [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 676.247705] env[59369]: DEBUG nova.scheduler.client.report [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 676.262378] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59369) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 676.262520] env[59369]: DEBUG oslo_concurrency.lockutils 
[None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.372s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 676.769781] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 676.770060] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Starting heal instance info cache {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 676.770095] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Rebuilding the list of instances to heal {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 676.798646] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 676.798646] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 676.798646] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Skipping network cache update for instance because it is Building. 
{{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 676.798785] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: d501202a-354c-42e1-8480-f026d5216a58] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 676.798857] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 676.798970] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 676.799094] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 676.799205] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 676.799471] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Didn't find any instances for network info cache update. 
{{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 676.800074] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 676.800251] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 676.800398] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 676.800528] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59369) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 677.258404] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 677.258603] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 686.028416] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Acquiring lock "8960dfc9-2908-4113-bebd-3a045d3e135c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 686.028782] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Lock "8960dfc9-2908-4113-bebd-3a045d3e135c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 686.041721] env[59369]: DEBUG nova.compute.manager [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Starting instance... 
{{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 686.095954] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 686.096209] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 686.097695] env[59369]: INFO nova.compute.claims [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 686.313302] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df795586-68bb-430c-9516-1b3cb06b4f29 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.323784] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a5a1e61-7060-4ba8-af92-3c3102877d83 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.364481] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c52039a4-c1fe-40db-9951-57d54bf7b528 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.373659] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16dd5965-9e81-4bdd-9c4a-fd3b68152370 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.389715] env[59369]: DEBUG nova.compute.provider_tree [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 686.407021] env[59369]: DEBUG nova.scheduler.client.report [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 686.437905] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.342s {{(pid=59369) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 686.439338] env[59369]: DEBUG nova.compute.manager [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 686.512505] env[59369]: DEBUG nova.compute.utils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 686.516758] env[59369]: DEBUG nova.compute.manager [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Allocating IP information in the background. {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 686.519737] env[59369]: DEBUG nova.network.neutron [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 686.535948] env[59369]: DEBUG nova.compute.manager [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Start building block device mappings for instance. 
{{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 686.657367] env[59369]: DEBUG nova.compute.manager [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Start spawning the instance on the hypervisor. {{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 686.681731] env[59369]: DEBUG nova.virt.hardware [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 686.681980] env[59369]: DEBUG nova.virt.hardware [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 686.682153] env[59369]: DEBUG nova.virt.hardware [None 
req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 686.682335] env[59369]: DEBUG nova.virt.hardware [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Flavor pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 686.682478] env[59369]: DEBUG nova.virt.hardware [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 686.682620] env[59369]: DEBUG nova.virt.hardware [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 686.682823] env[59369]: DEBUG nova.virt.hardware [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 686.682979] env[59369]: DEBUG nova.virt.hardware [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 
tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 686.683250] env[59369]: DEBUG nova.virt.hardware [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 686.683436] env[59369]: DEBUG nova.virt.hardware [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 686.683612] env[59369]: DEBUG nova.virt.hardware [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 686.684504] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1079a58-00f9-48ab-86d6-c198fd10c775 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.695277] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-506bf4f3-d731-44f9-9004-ea96d98636aa {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.743576] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 
tempest-ServerActionsV293TestJSON-588358398-project-member] Acquiring lock "fe4378ed-29fd-4b4e-8943-80c833830357" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 686.743782] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lock "fe4378ed-29fd-4b4e-8943-80c833830357" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 686.755704] env[59369]: DEBUG nova.compute.manager [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Starting instance... 
{{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 686.806100] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 686.806376] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 686.807985] env[59369]: INFO nova.compute.claims [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 686.987948] env[59369]: DEBUG nova.policy [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a65b78a0916c4cc2a46b4344e73a89bd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9ca16ba07aa94c24be48f9d25258d2aa', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize 
/opt/stack/nova/nova/policy.py:203}} [ 687.033710] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3029dec6-9d2f-408b-a7a2-b15caf3fea3f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.043986] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a17b447-b18c-4ed4-91ad-011b45de3aa7 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.079589] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba729ef8-1e96-4e08-9cd0-c8c456c30695 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.091736] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77b8ffd0-5720-4668-bc36-5db7dcf39d23 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.107683] env[59369]: DEBUG nova.compute.provider_tree [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 687.118789] env[59369]: DEBUG nova.scheduler.client.report [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 687.134426] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.328s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 687.134789] env[59369]: DEBUG nova.compute.manager [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 687.178592] env[59369]: DEBUG nova.compute.utils [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 687.180011] env[59369]: DEBUG nova.compute.manager [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Allocating IP information in the background. 
{{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 687.180315] env[59369]: DEBUG nova.network.neutron [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 687.196235] env[59369]: DEBUG nova.compute.manager [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 687.242918] env[59369]: INFO nova.virt.block_device [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Booting with volume 3a1b2e8d-d5a5-493f-907f-3cfc470a56e3 at /dev/sda [ 687.301558] env[59369]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fb6ee14a-f15a-4d2b-bbd3-9f1cd3074608 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.311854] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0ce6690-7981-40fd-a962-1c0caa125326 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.346311] env[59369]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fb3cb493-8472-45c1-9c62-5d72aedffe57 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.365052] env[59369]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-978ff0ea-8ee2-46bb-9ed3-c408e2a5f288 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.400702] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-401c804b-92a9-4cd2-8947-220939ed7cfa {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.411691] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90f8dbf4-ec6c-4409-9cfe-b0d8315d446c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.416603] env[59369]: DEBUG nova.policy [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a17e081723084de6bd7a85520d63f837', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6582e45deec14a68b4f22faa53d43d42', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}} [ 687.435503] env[59369]: DEBUG nova.virt.block_device [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Updating existing volume attachment record: 856b91b7-0380-48b9-834e-c9904ff0759a {{(pid=59369) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 687.728890] env[59369]: DEBUG nova.compute.manager [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 
tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Start spawning the instance on the hypervisor. {{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 687.729442] env[59369]: DEBUG nova.virt.hardware [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 687.729631] env[59369]: DEBUG nova.virt.hardware [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 687.729772] env[59369]: DEBUG nova.virt.hardware [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 687.729938] env[59369]: DEBUG nova.virt.hardware [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] 
Flavor pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 687.730286] env[59369]: DEBUG nova.virt.hardware [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 687.730503] env[59369]: DEBUG nova.virt.hardware [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 687.730768] env[59369]: DEBUG nova.virt.hardware [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 687.730972] env[59369]: DEBUG nova.virt.hardware [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 687.731204] env[59369]: DEBUG nova.virt.hardware [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 687.731559] env[59369]: DEBUG nova.virt.hardware [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 
tempest-ServerActionsV293TestJSON-588358398-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 687.731789] env[59369]: DEBUG nova.virt.hardware [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 687.732960] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-266612e4-8d62-4ebb-9e83-d65653eca53c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.742667] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5cea049-5ea1-437f-acb5-3f784dc6a71e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 688.899269] env[59369]: DEBUG nova.network.neutron [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Successfully created port: d2a414fc-289a-4eaa-8df0-e878c26e007d {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 689.334373] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Acquiring lock "ed81e73a-8787-4c22-842b-891d2ef116c7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.334602] 
env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Lock "ed81e73a-8787-4c22-842b-891d2ef116c7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.617573] env[59369]: DEBUG nova.network.neutron [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Successfully created port: 8c655868-3e30-4f41-8743-7288d0faa33c {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 689.724838] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Acquiring lock "4062c0d2-5018-4a4f-9bc6-2fbe57a92a16" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.725059] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Lock "4062c0d2-5018-4a4f-9bc6-2fbe57a92a16" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.541050] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Acquiring lock 
"f1815366-ecdb-4c46-a719-15f0ecdf5717" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.541413] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Lock "f1815366-ecdb-4c46-a719-15f0ecdf5717" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.567582] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Acquiring lock "f6b26955-b900-40d2-a910-57a4435c629c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.567805] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Lock "f6b26955-b900-40d2-a910-57a4435c629c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.943938] env[59369]: DEBUG oslo_concurrency.lockutils [None req-e8d0e7cd-999b-44b4-88e4-510f0fdb0b9f tempest-ImagesTestJSON-709925325 tempest-ImagesTestJSON-709925325-project-member] Acquiring lock "598a1045-cf7a-43e5-ae0d-438b246b6f7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.944587] env[59369]: DEBUG oslo_concurrency.lockutils [None req-e8d0e7cd-999b-44b4-88e4-510f0fdb0b9f tempest-ImagesTestJSON-709925325 tempest-ImagesTestJSON-709925325-project-member] Lock "598a1045-cf7a-43e5-ae0d-438b246b6f7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.401800] env[59369]: DEBUG oslo_concurrency.lockutils [None req-787bc5bf-967b-4766-b919-151d586b46a6 tempest-ServerGroupTestJSON-535910261 tempest-ServerGroupTestJSON-535910261-project-member] Acquiring lock "2e0fe984-3c0c-40cb-95e0-0090f7b1b36c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.402029] env[59369]: DEBUG oslo_concurrency.lockutils [None req-787bc5bf-967b-4766-b919-151d586b46a6 tempest-ServerGroupTestJSON-535910261 tempest-ServerGroupTestJSON-535910261-project-member] Lock "2e0fe984-3c0c-40cb-95e0-0090f7b1b36c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.216826] env[59369]: DEBUG nova.network.neutron [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Successfully updated port: d2a414fc-289a-4eaa-8df0-e878c26e007d {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 692.228052] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f 
tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Acquiring lock "refresh_cache-8960dfc9-2908-4113-bebd-3a045d3e135c" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 692.229056] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Acquired lock "refresh_cache-8960dfc9-2908-4113-bebd-3a045d3e135c" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 692.229056] env[59369]: DEBUG nova.network.neutron [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 692.381956] env[59369]: DEBUG nova.network.neutron [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Instance cache missing network info. 
{{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 692.518101] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d9d7d4cc-b035-447c-82ab-a53a5b6d9913 tempest-InstanceActionsTestJSON-606357078 tempest-InstanceActionsTestJSON-606357078-project-member] Acquiring lock "c6e58407-630f-4c11-a810-2ffec58fc397" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.519332] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d9d7d4cc-b035-447c-82ab-a53a5b6d9913 tempest-InstanceActionsTestJSON-606357078 tempest-InstanceActionsTestJSON-606357078-project-member] Lock "c6e58407-630f-4c11-a810-2ffec58fc397" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.764546] env[59369]: DEBUG nova.network.neutron [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Successfully updated port: 8c655868-3e30-4f41-8743-7288d0faa33c {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 692.776071] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Acquiring lock "refresh_cache-fe4378ed-29fd-4b4e-8943-80c833830357" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 692.776218] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Acquired lock 
"refresh_cache-fe4378ed-29fd-4b4e-8943-80c833830357" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 692.776362] env[59369]: DEBUG nova.network.neutron [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 692.893885] env[59369]: DEBUG nova.network.neutron [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Instance cache missing network info. {{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 693.086346] env[59369]: DEBUG nova.network.neutron [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Updating instance_info_cache with network_info: [{"id": "d2a414fc-289a-4eaa-8df0-e878c26e007d", "address": "fa:16:3e:cf:ba:81", "network": {"id": "57080e74-6ee8-44af-ae2f-4edd2e463a12", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1348241518-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9ca16ba07aa94c24be48f9d25258d2aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", 
"port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2a414fc-28", "ovs_interfaceid": "d2a414fc-289a-4eaa-8df0-e878c26e007d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.103975] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Releasing lock "refresh_cache-8960dfc9-2908-4113-bebd-3a045d3e135c" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 693.104356] env[59369]: DEBUG nova.compute.manager [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Instance network_info: |[{"id": "d2a414fc-289a-4eaa-8df0-e878c26e007d", "address": "fa:16:3e:cf:ba:81", "network": {"id": "57080e74-6ee8-44af-ae2f-4edd2e463a12", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1348241518-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9ca16ba07aa94c24be48f9d25258d2aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", 
"port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2a414fc-28", "ovs_interfaceid": "d2a414fc-289a-4eaa-8df0-e878c26e007d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 693.104715] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cf:ba:81', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '46785c9c-8b22-487d-a854-b3e67c5ed1d7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd2a414fc-289a-4eaa-8df0-e878c26e007d', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 693.118703] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Creating folder: Project (9ca16ba07aa94c24be48f9d25258d2aa). Parent ref: group-v121837. 
{{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 693.125982] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-010d815d-5ed6-4d11-91ac-c27f79dd9487 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.143294] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Created folder: Project (9ca16ba07aa94c24be48f9d25258d2aa) in parent group-v121837. [ 693.143918] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Creating folder: Instances. Parent ref: group-v121873. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 693.144200] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ca8d191-c837-459f-a4bb-b907497f1857 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.154143] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Created folder: Instances in parent group-v121873. [ 693.154435] env[59369]: DEBUG oslo.service.loopingcall [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 693.154630] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 693.154827] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9f0a1374-a814-48f2-adbf-5cf5eea777c9 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.184898] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 693.184898] env[59369]: value = "task-463275" [ 693.184898] env[59369]: _type = "Task" [ 693.184898] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 693.192995] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463275, 'name': CreateVM_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 693.411390] env[59369]: DEBUG nova.compute.manager [req-82708835-500b-44e6-a642-bf51b1314e7d req-79ec1985-09c1-4103-b0a3-904ffceaa948 service nova] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Received event network-vif-plugged-d2a414fc-289a-4eaa-8df0-e878c26e007d {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 693.411658] env[59369]: DEBUG oslo_concurrency.lockutils [req-82708835-500b-44e6-a642-bf51b1314e7d req-79ec1985-09c1-4103-b0a3-904ffceaa948 service nova] Acquiring lock "8960dfc9-2908-4113-bebd-3a045d3e135c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.411939] env[59369]: DEBUG oslo_concurrency.lockutils [req-82708835-500b-44e6-a642-bf51b1314e7d req-79ec1985-09c1-4103-b0a3-904ffceaa948 service nova] Lock 
"8960dfc9-2908-4113-bebd-3a045d3e135c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.412170] env[59369]: DEBUG oslo_concurrency.lockutils [req-82708835-500b-44e6-a642-bf51b1314e7d req-79ec1985-09c1-4103-b0a3-904ffceaa948 service nova] Lock "8960dfc9-2908-4113-bebd-3a045d3e135c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.412395] env[59369]: DEBUG nova.compute.manager [req-82708835-500b-44e6-a642-bf51b1314e7d req-79ec1985-09c1-4103-b0a3-904ffceaa948 service nova] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] No waiting events found dispatching network-vif-plugged-d2a414fc-289a-4eaa-8df0-e878c26e007d {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 693.412847] env[59369]: WARNING nova.compute.manager [req-82708835-500b-44e6-a642-bf51b1314e7d req-79ec1985-09c1-4103-b0a3-904ffceaa948 service nova] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Received unexpected event network-vif-plugged-d2a414fc-289a-4eaa-8df0-e878c26e007d for instance with vm_state building and task_state spawning. 
[ 693.423880] env[59369]: DEBUG nova.network.neutron [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Updating instance_info_cache with network_info: [{"id": "8c655868-3e30-4f41-8743-7288d0faa33c", "address": "fa:16:3e:1f:9a:56", "network": {"id": "65965d49-7ad0-4e0b-a8a4-5905cad20d37", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-2066486182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6582e45deec14a68b4f22faa53d43d42", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "82dbbfe2-640b-433f-a8e9-1566bd40fb34", "external-id": "nsx-vlan-transportzone-625", "segmentation_id": 625, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8c655868-3e", "ovs_interfaceid": "8c655868-3e30-4f41-8743-7288d0faa33c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.444695] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Releasing lock "refresh_cache-fe4378ed-29fd-4b4e-8943-80c833830357" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 693.445026] env[59369]: DEBUG 
nova.compute.manager [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Instance network_info: |[{"id": "8c655868-3e30-4f41-8743-7288d0faa33c", "address": "fa:16:3e:1f:9a:56", "network": {"id": "65965d49-7ad0-4e0b-a8a4-5905cad20d37", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-2066486182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6582e45deec14a68b4f22faa53d43d42", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "82dbbfe2-640b-433f-a8e9-1566bd40fb34", "external-id": "nsx-vlan-transportzone-625", "segmentation_id": 625, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8c655868-3e", "ovs_interfaceid": "8c655868-3e30-4f41-8743-7288d0faa33c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 693.445396] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1f:9a:56', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '82dbbfe2-640b-433f-a8e9-1566bd40fb34', 'network-type': 'nsx.LogicalSwitch', 
'use-external-id': True}, 'iface_id': '8c655868-3e30-4f41-8743-7288d0faa33c', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 693.455278] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Creating folder: Project (6582e45deec14a68b4f22faa53d43d42). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 693.456043] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-993ac398-cd1e-46e9-bdae-8ade61f1b0b9 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.472957] env[59369]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 693.473160] env[59369]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=59369) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 693.473699] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Folder already exists: Project (6582e45deec14a68b4f22faa53d43d42). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 693.473893] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Creating folder: Instances. Parent ref: group-v121869. 
{{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 693.474548] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d8047691-3262-401e-bc44-dc378d03739e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.486819] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Created folder: Instances in parent group-v121869. [ 693.487066] env[59369]: DEBUG oslo.service.loopingcall [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 693.487264] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 693.487465] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c7dcc97c-1206-4a58-b12f-677864ae4145 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.514223] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 693.514223] env[59369]: value = "task-463278" [ 693.514223] env[59369]: _type = "Task" [ 693.514223] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 693.525106] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463278, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 693.696359] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463275, 'name': CreateVM_Task, 'duration_secs': 0.352634} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 693.696359] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 693.697677] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 693.697965] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 693.698823] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 693.699210] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b77e0ab3-5ec8-4c36-b96b-e58223d2affd 
{{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.707027] env[59369]: DEBUG oslo_vmware.api [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Waiting for the task: (returnval){ [ 693.707027] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]5245a163-e801-a280-926c-7f962a2ec81f" [ 693.707027] env[59369]: _type = "Task" [ 693.707027] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 693.717436] env[59369]: DEBUG oslo_vmware.api [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]5245a163-e801-a280-926c-7f962a2ec81f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 694.000515] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a934eed2-a9bd-441f-9ec0-98d48dacdafd tempest-DeleteServersTestJSON-1791923723 tempest-DeleteServersTestJSON-1791923723-project-member] Acquiring lock "11a0c528-536f-4df5-906a-d4fa9fd581d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.000736] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a934eed2-a9bd-441f-9ec0-98d48dacdafd tempest-DeleteServersTestJSON-1791923723 tempest-DeleteServersTestJSON-1791923723-project-member] Lock "11a0c528-536f-4df5-906a-d4fa9fd581d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.028310] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463278, 'name': CreateVM_Task, 'duration_secs': 0.299528} completed successfully. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 694.028476] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 694.029071] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'guest_format': None, 'device_type': None, 'attachment_id': '856b91b7-0380-48b9-834e-c9904ff0759a', 'mount_device': '/dev/sda', 'delete_on_termination': True, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-121872', 'volume_id': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'name': 'volume-3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'fe4378ed-29fd-4b4e-8943-80c833830357', 'attached_at': '', 'detached_at': '', 'volume_id': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'serial': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3'}, 'disk_bus': None, 'volume_type': None}], 'swap': None} {{(pid=59369) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 694.029453] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Root volume attach. 
Driver type: vmdk {{(pid=59369) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 694.030270] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5807895f-0575-44d8-8bb8-84fc49b7e3cf {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.041285] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea32f025-040f-4dbd-941c-9c3a6c810cba {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.048613] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8ced4ea-93d4-4321-b3f3-4a260448266b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.055918] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-649e2123-526c-4140-8dfc-b35d1806027b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.065987] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for the task: (returnval){ [ 694.065987] env[59369]: value = "task-463279" [ 694.065987] env[59369]: _type = "Task" [ 694.065987] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 694.075632] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463279, 'name': RelocateVM_Task} progress is 5%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 694.223885] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 694.227373] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 694.227623] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3b1eb679-35b4-4cf7-a1c9-cb785ca4878f tempest-FloatingIPsAssociationNegativeTestJSON-821667375 tempest-FloatingIPsAssociationNegativeTestJSON-821667375-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 694.578045] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463279, 'name': RelocateVM_Task, 'duration_secs': 0.378809} completed successfully. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 694.579273] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Volume attach. Driver type: vmdk {{(pid=59369) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 694.579273] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-121872', 'volume_id': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'name': 'volume-3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'fe4378ed-29fd-4b4e-8943-80c833830357', 'attached_at': '', 'detached_at': '', 'volume_id': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'serial': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3'} {{(pid=59369) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 694.579825] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-144b0750-163f-4625-afae-c8d6ec1dc5b8 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.600767] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-581e0f82-00cf-4436-bdff-c98fce66e04f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.628255] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 
tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Reconfiguring VM instance instance-0000000b to attach disk [datastore1] volume-3a1b2e8d-d5a5-493f-907f-3cfc470a56e3/volume-3a1b2e8d-d5a5-493f-907f-3cfc470a56e3.vmdk or device None with type thin {{(pid=59369) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 694.628613] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-b6c502af-f87d-47c4-b236-64b5bab211eb {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.649497] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for the task: (returnval){ [ 694.649497] env[59369]: value = "task-463280" [ 694.649497] env[59369]: _type = "Task" [ 694.649497] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 694.661307] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463280, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 695.164256] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463280, 'name': ReconfigVM_Task, 'duration_secs': 0.272338} completed successfully. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 695.164256] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Reconfigured VM instance instance-0000000b to attach disk [datastore1] volume-3a1b2e8d-d5a5-493f-907f-3cfc470a56e3/volume-3a1b2e8d-d5a5-493f-907f-3cfc470a56e3.vmdk or device None with type thin {{(pid=59369) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 695.169480] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-64214153-3046-4998-87d6-2ed41dc2aa23 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.189019] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for the task: (returnval){ [ 695.189019] env[59369]: value = "task-463281" [ 695.189019] env[59369]: _type = "Task" [ 695.189019] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 695.198890] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463281, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 695.702358] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463281, 'name': ReconfigVM_Task, 'duration_secs': 0.12941} completed successfully. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 695.704492] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-121872', 'volume_id': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'name': 'volume-3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'fe4378ed-29fd-4b4e-8943-80c833830357', 'attached_at': '', 'detached_at': '', 'volume_id': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'serial': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3'} {{(pid=59369) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 695.705724] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-32d68fb1-1d32-430a-927c-732db73bc480 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.714273] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for the task: (returnval){ [ 695.714273] env[59369]: value = "task-463282" [ 695.714273] env[59369]: _type = "Task" [ 695.714273] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 695.727796] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463282, 'name': Rename_Task} progress is 5%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 696.007544] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0e794cd-886b-4304-b14a-61236edd6571 tempest-AttachInterfacesTestJSON-767239700 tempest-AttachInterfacesTestJSON-767239700-project-member] Acquiring lock "fbc72ed9-2e72-4c09-bfaa-48d56df2edc0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.008716] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0e794cd-886b-4304-b14a-61236edd6571 tempest-AttachInterfacesTestJSON-767239700 tempest-AttachInterfacesTestJSON-767239700-project-member] Lock "fbc72ed9-2e72-4c09-bfaa-48d56df2edc0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.225487] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463282, 'name': Rename_Task, 'duration_secs': 0.126715} completed successfully. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 696.225781] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Powering on the VM {{(pid=59369) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 696.226022] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-04a536e4-24f1-4c48-ad28-f06191bd4ce9 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.234042] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for the task: (returnval){ [ 696.234042] env[59369]: value = "task-463283" [ 696.234042] env[59369]: _type = "Task" [ 696.234042] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 696.244363] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463283, 'name': PowerOnVM_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 696.751562] env[59369]: DEBUG oslo_vmware.api [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463283, 'name': PowerOnVM_Task, 'duration_secs': 0.514469} completed successfully. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 696.756279] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Powered on the VM {{(pid=59369) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 696.756279] env[59369]: INFO nova.compute.manager [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Took 9.02 seconds to spawn the instance on the hypervisor. [ 696.756279] env[59369]: DEBUG nova.compute.manager [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Checking state {{(pid=59369) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 696.756279] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13858a5d-8531-4f99-a48d-dc34546b91ac {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.853498] env[59369]: INFO nova.compute.manager [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Took 10.06 seconds to build instance. 
[ 696.875481] env[59369]: DEBUG oslo_concurrency.lockutils [None req-4803dc67-a3e4-4ac4-af85-869de62e91a4 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lock "fe4378ed-29fd-4b4e-8943-80c833830357" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.131s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.893732] env[59369]: DEBUG nova.compute.manager [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Starting instance... {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 696.967105] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.967385] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.969037] env[59369]: INFO nova.compute.claims [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 697.389559] 
env[59369]: DEBUG oslo_concurrency.lockutils [None req-6cfd08bb-415c-409b-b448-06ac3919f0f1 tempest-ServersTestJSON-2081867747 tempest-ServersTestJSON-2081867747-project-member] Acquiring lock "a7c91b61-dc9b-4d79-b18f-4339b2248e04" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.390382] env[59369]: DEBUG oslo_concurrency.lockutils [None req-6cfd08bb-415c-409b-b448-06ac3919f0f1 tempest-ServersTestJSON-2081867747 tempest-ServersTestJSON-2081867747-project-member] Lock "a7c91b61-dc9b-4d79-b18f-4339b2248e04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.447791] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81143228-757d-4c99-903d-c38d2c7bafaa {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.458592] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69e1943d-8f9a-43ff-9a44-1e1356a4a7fa {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.498488] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7c1a67f-9649-4c24-afe3-4aba86de1a6c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.507735] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33b50896-32f2-4225-89aa-2b1acff82fda {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.523810] env[59369]: DEBUG nova.compute.provider_tree [None 
req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 697.535874] env[59369]: DEBUG nova.scheduler.client.report [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 697.557779] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.590s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.559112] env[59369]: DEBUG nova.compute.manager [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Start building networks asynchronously for instance. 
{{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 697.610819] env[59369]: DEBUG nova.compute.utils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 697.614237] env[59369]: DEBUG nova.compute.manager [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Allocating IP information in the background. {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 697.614237] env[59369]: DEBUG nova.network.neutron [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 697.621609] env[59369]: DEBUG nova.compute.manager [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 697.704184] env[59369]: DEBUG nova.compute.manager [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Start spawning the instance on the hypervisor. 
{{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 697.741149] env[59369]: DEBUG nova.virt.hardware [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 697.741405] env[59369]: DEBUG nova.virt.hardware [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 697.741556] env[59369]: DEBUG nova.virt.hardware [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 697.742093] env[59369]: DEBUG nova.virt.hardware [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Flavor pref 
0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 697.742093] env[59369]: DEBUG nova.virt.hardware [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 697.742093] env[59369]: DEBUG nova.virt.hardware [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 697.742258] env[59369]: DEBUG nova.virt.hardware [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 697.742356] env[59369]: DEBUG nova.virt.hardware [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 697.742512] env[59369]: DEBUG nova.virt.hardware [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 697.742665] env[59369]: DEBUG nova.virt.hardware [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 
tempest-VolumesAdminNegativeTest-879448813-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 697.742830] env[59369]: DEBUG nova.virt.hardware [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 697.744876] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58fc78a5-9035-49e9-8876-48a87f28b084 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.755566] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39809e18-a41d-4a95-866a-eaa5b12fd617 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.862653] env[59369]: DEBUG nova.policy [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'db2636136bed439e9fdf2f95b05d2ee2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '42bad7c8d9fb429f9f2bf0a732bf1459', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}} [ 698.152394] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3c8bf8ea-6e31-4e45-a208-85e40e6aa4e5 tempest-ServerAddressesNegativeTestJSON-1689922645 
tempest-ServerAddressesNegativeTestJSON-1689922645-project-member] Acquiring lock "35ba3bec-5278-496f-aa6c-a87861e1d8e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.154021] env[59369]: DEBUG oslo_concurrency.lockutils [None req-3c8bf8ea-6e31-4e45-a208-85e40e6aa4e5 tempest-ServerAddressesNegativeTestJSON-1689922645 tempest-ServerAddressesNegativeTestJSON-1689922645-project-member] Lock "35ba3bec-5278-496f-aa6c-a87861e1d8e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.238633] env[59369]: DEBUG nova.compute.manager [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Received event network-vif-plugged-8c655868-3e30-4f41-8743-7288d0faa33c {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 698.238633] env[59369]: DEBUG oslo_concurrency.lockutils [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] Acquiring lock "fe4378ed-29fd-4b4e-8943-80c833830357-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.239115] env[59369]: DEBUG oslo_concurrency.lockutils [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] Lock "fe4378ed-29fd-4b4e-8943-80c833830357-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.239440] env[59369]: DEBUG 
oslo_concurrency.lockutils [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] Lock "fe4378ed-29fd-4b4e-8943-80c833830357-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.239713] env[59369]: DEBUG nova.compute.manager [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] No waiting events found dispatching network-vif-plugged-8c655868-3e30-4f41-8743-7288d0faa33c {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 698.239963] env[59369]: WARNING nova.compute.manager [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Received unexpected event network-vif-plugged-8c655868-3e30-4f41-8743-7288d0faa33c for instance with vm_state active and task_state None. [ 698.240217] env[59369]: DEBUG nova.compute.manager [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Received event network-changed-d2a414fc-289a-4eaa-8df0-e878c26e007d {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 698.240491] env[59369]: DEBUG nova.compute.manager [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Refreshing instance network info cache due to event network-changed-d2a414fc-289a-4eaa-8df0-e878c26e007d. 
{{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 698.241251] env[59369]: DEBUG oslo_concurrency.lockutils [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] Acquiring lock "refresh_cache-8960dfc9-2908-4113-bebd-3a045d3e135c" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 698.241483] env[59369]: DEBUG oslo_concurrency.lockutils [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] Acquired lock "refresh_cache-8960dfc9-2908-4113-bebd-3a045d3e135c" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 698.241721] env[59369]: DEBUG nova.network.neutron [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Refreshing network info cache for port d2a414fc-289a-4eaa-8df0-e878c26e007d {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 699.268803] env[59369]: DEBUG oslo_concurrency.lockutils [None req-5ad99b9c-53ee-4005-b2c6-f32fb9fad62d tempest-ServersTestJSON-1277878412 tempest-ServersTestJSON-1277878412-project-member] Acquiring lock "3106cf2f-1f97-4f1e-94a0-718f5fb735c8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.269543] env[59369]: DEBUG oslo_concurrency.lockutils [None req-5ad99b9c-53ee-4005-b2c6-f32fb9fad62d tempest-ServersTestJSON-1277878412 tempest-ServersTestJSON-1277878412-project-member] Lock "3106cf2f-1f97-4f1e-94a0-718f5fb735c8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.404372] 
env[59369]: DEBUG nova.network.neutron [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Updated VIF entry in instance network info cache for port d2a414fc-289a-4eaa-8df0-e878c26e007d. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 699.404862] env[59369]: DEBUG nova.network.neutron [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Updating instance_info_cache with network_info: [{"id": "d2a414fc-289a-4eaa-8df0-e878c26e007d", "address": "fa:16:3e:cf:ba:81", "network": {"id": "57080e74-6ee8-44af-ae2f-4edd2e463a12", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1348241518-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9ca16ba07aa94c24be48f9d25258d2aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd2a414fc-28", "ovs_interfaceid": "d2a414fc-289a-4eaa-8df0-e878c26e007d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 699.421145] env[59369]: DEBUG oslo_concurrency.lockutils 
[req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] Releasing lock "refresh_cache-8960dfc9-2908-4113-bebd-3a045d3e135c" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 699.421145] env[59369]: DEBUG nova.compute.manager [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Received event network-changed-8c655868-3e30-4f41-8743-7288d0faa33c {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 699.421755] env[59369]: DEBUG nova.compute.manager [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Refreshing instance network info cache due to event network-changed-8c655868-3e30-4f41-8743-7288d0faa33c. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 699.421755] env[59369]: DEBUG oslo_concurrency.lockutils [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] Acquiring lock "refresh_cache-fe4378ed-29fd-4b4e-8943-80c833830357" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 699.421755] env[59369]: DEBUG oslo_concurrency.lockutils [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] Acquired lock "refresh_cache-fe4378ed-29fd-4b4e-8943-80c833830357" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 699.421894] env[59369]: DEBUG nova.network.neutron [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Refreshing network info cache for port 8c655868-3e30-4f41-8743-7288d0faa33c {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} 
[ 699.430561] env[59369]: DEBUG nova.network.neutron [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Successfully created port: 0a1a07b5-a5ca-4866-8f69-c8b21ce29893 {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 699.790899] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ad8ad49c-4af1-4656-bbc6-ec09092e6f4f tempest-ServerShowV247Test-127796081 tempest-ServerShowV247Test-127796081-project-member] Acquiring lock "4de279b1-f754-4c7b-a53c-90824b17e766" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.791165] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ad8ad49c-4af1-4656-bbc6-ec09092e6f4f tempest-ServerShowV247Test-127796081 tempest-ServerShowV247Test-127796081-project-member] Lock "4de279b1-f754-4c7b-a53c-90824b17e766" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.301577] env[59369]: DEBUG nova.network.neutron [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Updated VIF entry in instance network info cache for port 8c655868-3e30-4f41-8743-7288d0faa33c. 
{{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 700.302638] env[59369]: DEBUG nova.network.neutron [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Updating instance_info_cache with network_info: [{"id": "8c655868-3e30-4f41-8743-7288d0faa33c", "address": "fa:16:3e:1f:9a:56", "network": {"id": "65965d49-7ad0-4e0b-a8a4-5905cad20d37", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-2066486182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6582e45deec14a68b4f22faa53d43d42", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "82dbbfe2-640b-433f-a8e9-1566bd40fb34", "external-id": "nsx-vlan-transportzone-625", "segmentation_id": 625, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8c655868-3e", "ovs_interfaceid": "8c655868-3e30-4f41-8743-7288d0faa33c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 700.327614] env[59369]: DEBUG oslo_concurrency.lockutils [req-690ceaa4-e727-4263-ab34-8ff776a796f3 req-64c34146-2d98-40fb-bdd3-797408963900 service nova] Releasing lock "refresh_cache-fe4378ed-29fd-4b4e-8943-80c833830357" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 700.803973] env[59369]: DEBUG nova.network.neutron [None 
req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Successfully updated port: 0a1a07b5-a5ca-4866-8f69-c8b21ce29893 {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 700.819527] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Acquiring lock "refresh_cache-ed81e73a-8787-4c22-842b-891d2ef116c7" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 700.819680] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Acquired lock "refresh_cache-ed81e73a-8787-4c22-842b-891d2ef116c7" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 700.819829] env[59369]: DEBUG nova.network.neutron [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 700.878204] env[59369]: DEBUG nova.network.neutron [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Instance cache missing network info. 
{{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 701.315928] env[59369]: DEBUG nova.network.neutron [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Updating instance_info_cache with network_info: [{"id": "0a1a07b5-a5ca-4866-8f69-c8b21ce29893", "address": "fa:16:3e:fb:ca:d6", "network": {"id": "4538f772-a7e1-470f-9ef6-d9da7ffc52b8", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-703548089-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "42bad7c8d9fb429f9f2bf0a732bf1459", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cb478a6-872c-4a90-a8db-526b374e82ce", "external-id": "nsx-vlan-transportzone-835", "segmentation_id": 835, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0a1a07b5-a5", "ovs_interfaceid": "0a1a07b5-a5ca-4866-8f69-c8b21ce29893", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 701.330405] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Releasing lock "refresh_cache-ed81e73a-8787-4c22-842b-891d2ef116c7" {{(pid=59369) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 701.330761] env[59369]: DEBUG nova.compute.manager [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Instance network_info: |[{"id": "0a1a07b5-a5ca-4866-8f69-c8b21ce29893", "address": "fa:16:3e:fb:ca:d6", "network": {"id": "4538f772-a7e1-470f-9ef6-d9da7ffc52b8", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-703548089-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "42bad7c8d9fb429f9f2bf0a732bf1459", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cb478a6-872c-4a90-a8db-526b374e82ce", "external-id": "nsx-vlan-transportzone-835", "segmentation_id": 835, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0a1a07b5-a5", "ovs_interfaceid": "0a1a07b5-a5ca-4866-8f69-c8b21ce29893", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 701.331555] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fb:ca:d6', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': '8cb478a6-872c-4a90-a8db-526b374e82ce', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0a1a07b5-a5ca-4866-8f69-c8b21ce29893', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 701.340243] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Creating folder: Project (42bad7c8d9fb429f9f2bf0a732bf1459). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 701.341342] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-46b06698-72ad-43c7-b25a-b6eecf22fa35 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.356190] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Created folder: Project (42bad7c8d9fb429f9f2bf0a732bf1459) in parent group-v121837. [ 701.356190] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Creating folder: Instances. Parent ref: group-v121878. 
{{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 701.356190] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-19ad1228-d259-481a-8981-581e0e49ecaa {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.366378] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Created folder: Instances in parent group-v121878. [ 701.366739] env[59369]: DEBUG oslo.service.loopingcall [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 701.366847] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 701.367040] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-778eae93-dac1-47f8-8672-a8312149e806 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.389911] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 701.389911] env[59369]: value = "task-463286" [ 701.389911] env[59369]: _type = "Task" [ 701.389911] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 701.398607] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463286, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 701.694806] env[59369]: DEBUG nova.compute.manager [req-b41b4eb5-ec6b-4314-a464-5bd1b202fa21 req-ae05712a-fdcd-44c3-b043-272b33dd6c35 service nova] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Received event network-vif-plugged-0a1a07b5-a5ca-4866-8f69-c8b21ce29893 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 701.695464] env[59369]: DEBUG oslo_concurrency.lockutils [req-b41b4eb5-ec6b-4314-a464-5bd1b202fa21 req-ae05712a-fdcd-44c3-b043-272b33dd6c35 service nova] Acquiring lock "ed81e73a-8787-4c22-842b-891d2ef116c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.695464] env[59369]: DEBUG oslo_concurrency.lockutils [req-b41b4eb5-ec6b-4314-a464-5bd1b202fa21 req-ae05712a-fdcd-44c3-b043-272b33dd6c35 service nova] Lock "ed81e73a-8787-4c22-842b-891d2ef116c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.695464] env[59369]: DEBUG oslo_concurrency.lockutils [req-b41b4eb5-ec6b-4314-a464-5bd1b202fa21 req-ae05712a-fdcd-44c3-b043-272b33dd6c35 service nova] Lock "ed81e73a-8787-4c22-842b-891d2ef116c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 701.695714] env[59369]: DEBUG nova.compute.manager [req-b41b4eb5-ec6b-4314-a464-5bd1b202fa21 req-ae05712a-fdcd-44c3-b043-272b33dd6c35 service nova] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] No waiting events found dispatching network-vif-plugged-0a1a07b5-a5ca-4866-8f69-c8b21ce29893 {{(pid=59369) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 701.695752] env[59369]: WARNING nova.compute.manager [req-b41b4eb5-ec6b-4314-a464-5bd1b202fa21 req-ae05712a-fdcd-44c3-b043-272b33dd6c35 service nova] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Received unexpected event network-vif-plugged-0a1a07b5-a5ca-4866-8f69-c8b21ce29893 for instance with vm_state building and task_state spawning. [ 701.902433] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463286, 'name': CreateVM_Task, 'duration_secs': 0.409689} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 701.902433] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 701.902505] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 701.902616] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 701.904108] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 701.904384] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-278124af-f1a2-472c-a29c-12b6625a5309 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.915989] env[59369]: DEBUG oslo_vmware.api [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Waiting for the task: (returnval){ [ 701.915989] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]52551ae6-3bc8-3bd1-1917-dc67fa612f9e" [ 701.915989] env[59369]: _type = "Task" [ 701.915989] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 701.928966] env[59369]: DEBUG oslo_vmware.api [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]52551ae6-3bc8-3bd1-1917-dc67fa612f9e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 702.081803] env[59369]: DEBUG oslo_concurrency.lockutils [None req-e522b98b-0583-4ce8-a068-28c93e3fe4ff tempest-ServerShowV247Test-127796081 tempest-ServerShowV247Test-127796081-project-member] Acquiring lock "83fad30b-6d18-40af-89fa-ccbc25dcc968" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 702.081961] env[59369]: DEBUG oslo_concurrency.lockutils [None req-e522b98b-0583-4ce8-a068-28c93e3fe4ff tempest-ServerShowV247Test-127796081 tempest-ServerShowV247Test-127796081-project-member] Lock "83fad30b-6d18-40af-89fa-ccbc25dcc968" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.430895] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 702.431085] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 702.431299] env[59369]: DEBUG oslo_concurrency.lockutils [None req-95154408-165f-4ede-9a4d-9cfcdbce8906 tempest-VolumesAdminNegativeTest-879448813 tempest-VolumesAdminNegativeTest-879448813-project-member] 
Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 703.772532] env[59369]: DEBUG oslo_concurrency.lockutils [None req-41f119e4-9387-456e-af99-4846729d6305 tempest-SecurityGroupsTestJSON-339733264 tempest-SecurityGroupsTestJSON-339733264-project-member] Acquiring lock "84c38578-ad32-4847-a6f1-94b2a288f89c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.772870] env[59369]: DEBUG oslo_concurrency.lockutils [None req-41f119e4-9387-456e-af99-4846729d6305 tempest-SecurityGroupsTestJSON-339733264 tempest-SecurityGroupsTestJSON-339733264-project-member] Lock "84c38578-ad32-4847-a6f1-94b2a288f89c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.043392] env[59369]: DEBUG nova.compute.manager [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Received event network-changed-0a1a07b5-a5ca-4866-8f69-c8b21ce29893 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 704.043582] env[59369]: DEBUG nova.compute.manager [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Refreshing instance network info cache due to event network-changed-0a1a07b5-a5ca-4866-8f69-c8b21ce29893. 
{{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 704.043793] env[59369]: DEBUG oslo_concurrency.lockutils [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] Acquiring lock "refresh_cache-ed81e73a-8787-4c22-842b-891d2ef116c7" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 704.043927] env[59369]: DEBUG oslo_concurrency.lockutils [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] Acquired lock "refresh_cache-ed81e73a-8787-4c22-842b-891d2ef116c7" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 704.044097] env[59369]: DEBUG nova.network.neutron [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Refreshing network info cache for port 0a1a07b5-a5ca-4866-8f69-c8b21ce29893 {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 704.195016] env[59369]: DEBUG oslo_concurrency.lockutils [None req-c565c454-e59d-43e3-882e-e9f8897c9d62 tempest-AttachVolumeShelveTestJSON-2049902658 tempest-AttachVolumeShelveTestJSON-2049902658-project-member] Acquiring lock "1ce49896-dfcf-4091-a1d2-257babf2e2cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.195276] env[59369]: DEBUG oslo_concurrency.lockutils [None req-c565c454-e59d-43e3-882e-e9f8897c9d62 tempest-AttachVolumeShelveTestJSON-2049902658 tempest-AttachVolumeShelveTestJSON-2049902658-project-member] Lock "1ce49896-dfcf-4091-a1d2-257babf2e2cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.362476] env[59369]: DEBUG nova.network.neutron [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Updated VIF entry in instance network info cache for port 0a1a07b5-a5ca-4866-8f69-c8b21ce29893. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 704.362816] env[59369]: DEBUG nova.network.neutron [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Updating instance_info_cache with network_info: [{"id": "0a1a07b5-a5ca-4866-8f69-c8b21ce29893", "address": "fa:16:3e:fb:ca:d6", "network": {"id": "4538f772-a7e1-470f-9ef6-d9da7ffc52b8", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-703548089-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "42bad7c8d9fb429f9f2bf0a732bf1459", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cb478a6-872c-4a90-a8db-526b374e82ce", "external-id": "nsx-vlan-transportzone-835", "segmentation_id": 835, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0a1a07b5-a5", "ovs_interfaceid": "0a1a07b5-a5ca-4866-8f69-c8b21ce29893", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 704.376534] 
env[59369]: DEBUG oslo_concurrency.lockutils [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] Releasing lock "refresh_cache-ed81e73a-8787-4c22-842b-891d2ef116c7" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 704.376787] env[59369]: DEBUG nova.compute.manager [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Received event network-changed-8c655868-3e30-4f41-8743-7288d0faa33c {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 704.376945] env[59369]: DEBUG nova.compute.manager [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Refreshing instance network info cache due to event network-changed-8c655868-3e30-4f41-8743-7288d0faa33c. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 704.377181] env[59369]: DEBUG oslo_concurrency.lockutils [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] Acquiring lock "refresh_cache-fe4378ed-29fd-4b4e-8943-80c833830357" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 704.378030] env[59369]: DEBUG oslo_concurrency.lockutils [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] Acquired lock "refresh_cache-fe4378ed-29fd-4b4e-8943-80c833830357" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 704.378030] env[59369]: DEBUG nova.network.neutron [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Refreshing network info cache for port 8c655868-3e30-4f41-8743-7288d0faa33c {{(pid=59369) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2007}} [ 704.733591] env[59369]: DEBUG nova.network.neutron [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Updated VIF entry in instance network info cache for port 8c655868-3e30-4f41-8743-7288d0faa33c. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 704.733855] env[59369]: DEBUG nova.network.neutron [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Updating instance_info_cache with network_info: [{"id": "8c655868-3e30-4f41-8743-7288d0faa33c", "address": "fa:16:3e:1f:9a:56", "network": {"id": "65965d49-7ad0-4e0b-a8a4-5905cad20d37", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-2066486182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6582e45deec14a68b4f22faa53d43d42", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "82dbbfe2-640b-433f-a8e9-1566bd40fb34", "external-id": "nsx-vlan-transportzone-625", "segmentation_id": 625, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8c655868-3e", "ovs_interfaceid": "8c655868-3e30-4f41-8743-7288d0faa33c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 704.748305] env[59369]: DEBUG oslo_concurrency.lockutils [req-7b2e2faa-4243-479e-8fef-fe6f14379958 req-b9c3c30b-22c0-4340-8f64-b324835e4e88 service nova] Releasing lock "refresh_cache-fe4378ed-29fd-4b4e-8943-80c833830357" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 704.891139] env[59369]: WARNING oslo_vmware.rw_handles [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 704.891139] env[59369]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 704.891139] env[59369]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 704.891139] env[59369]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 704.891139] env[59369]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 704.891139] env[59369]: ERROR oslo_vmware.rw_handles response.begin() [ 704.891139] env[59369]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 704.891139] env[59369]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 704.891139] env[59369]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 704.891139] env[59369]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 704.891139] env[59369]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 704.891139] env[59369]: ERROR oslo_vmware.rw_handles [ 704.891925] env[59369]: DEBUG nova.virt.vmwareapi.images [None req-619b2735-6c49-4b63-9711-f475a9391176 
tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Downloaded image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to vmware_temp/4e685b79-4225-4aa4-b66b-439c73f57f65/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk on the data store datastore1 {{(pid=59369) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 704.893045] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Caching image {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 704.893361] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Copying Virtual Disk [datastore1] vmware_temp/4e685b79-4225-4aa4-b66b-439c73f57f65/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk to [datastore1] vmware_temp/4e685b79-4225-4aa4-b66b-439c73f57f65/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk {{(pid=59369) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 704.895200] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-efa0c8e1-6512-4669-9b44-73e59591efac {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.904026] env[59369]: DEBUG oslo_vmware.api [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Waiting for the task: (returnval){ [ 704.904026] env[59369]: value = "task-463287" [ 704.904026] env[59369]: _type = "Task" [ 704.904026] env[59369]: } to complete. 
{{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 704.912401] env[59369]: DEBUG oslo_vmware.api [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Task: {'id': task-463287, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 705.415071] env[59369]: DEBUG oslo_vmware.exceptions [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Fault InvalidArgument not matched. {{(pid=59369) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 705.415332] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 705.415885] env[59369]: ERROR nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 705.415885] env[59369]: Faults: ['InvalidArgument'] [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Traceback (most recent call last): [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 705.415885] env[59369]: ERROR 
nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] yield resources [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] self.driver.spawn(context, instance, image_meta, [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] self._fetch_image_if_missing(context, vi) [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] image_cache(vi, tmp_image_ds_loc) [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] vm_util.copy_virtual_disk( [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: 
b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] session._wait_for_task(vmdk_copy_task) [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] return self.wait_for_task(task_ref) [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] return evt.wait() [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] result = hub.switch() [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] return self.greenlet.switch() [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] self.f(*self.args, **self.kw) [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] raise 
exceptions.translate_fault(task_info.error) [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Faults: ['InvalidArgument'] [ 705.415885] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] [ 705.416674] env[59369]: INFO nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Terminating instance [ 705.417719] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 705.417921] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 705.418538] env[59369]: DEBUG nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Start destroying the instance on the hypervisor. 
{{(pid=59369) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 705.418722] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Destroying instance {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 705.418944] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4cd6a515-4bb5-4c49-9ba3-838e7f54839b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.421375] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f86d0eb-5dbc-4638-b489-8613ead125b7 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.430665] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Unregistering the VM {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 705.430839] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-750455d8-409d-4647-88ab-89cd7a0ded67 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.433246] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 705.433415] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 
tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59369) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 705.434419] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-31c20e34-4f4b-40db-a66c-a2df383cd701 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.439454] env[59369]: DEBUG oslo_vmware.api [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Waiting for the task: (returnval){ [ 705.439454] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]52453a28-935a-3102-deac-15f4d3c4225c" [ 705.439454] env[59369]: _type = "Task" [ 705.439454] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 705.447344] env[59369]: DEBUG oslo_vmware.api [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]52453a28-935a-3102-deac-15f4d3c4225c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 705.503481] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Unregistered the VM {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 705.503481] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Deleting contents of the VM from datastore datastore1 {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 705.503481] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Deleting the datastore file [datastore1] b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0 {{(pid=59369) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 705.503794] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5e8c960d-af68-4f4d-a504-f6fa309508d5 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.510789] env[59369]: DEBUG oslo_vmware.api [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Waiting for the task: (returnval){ [ 705.510789] env[59369]: value = "task-463289" [ 705.510789] env[59369]: _type = "Task" [ 705.510789] env[59369]: } to complete. 
{{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 705.518798] env[59369]: DEBUG oslo_vmware.api [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Task: {'id': task-463289, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 705.693048] env[59369]: DEBUG oslo_concurrency.lockutils [None req-8ca84474-39bd-47dd-8087-291cadcdff8d tempest-ServerShowV257Test-1198015423 tempest-ServerShowV257Test-1198015423-project-member] Acquiring lock "f6c3dea6-e885-448d-b711-5d1e430f0664" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.693048] env[59369]: DEBUG oslo_concurrency.lockutils [None req-8ca84474-39bd-47dd-8087-291cadcdff8d tempest-ServerShowV257Test-1198015423 tempest-ServerShowV257Test-1198015423-project-member] Lock "f6c3dea6-e885-448d-b711-5d1e430f0664" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.951064] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Preparing fetch location {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 705.951475] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Creating directory with path [datastore1] 
vmware_temp/4c4e06b0-b2b9-4e89-b9a5-38be192f12a6/dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 705.951886] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-93aa6dbc-63cb-4515-a7e3-b662c90d83e6 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.964077] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Created directory with path [datastore1] vmware_temp/4c4e06b0-b2b9-4e89-b9a5-38be192f12a6/dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 705.964282] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Fetch image to [datastore1] vmware_temp/4c4e06b0-b2b9-4e89-b9a5-38be192f12a6/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 705.964730] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Downloading image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to [datastore1] vmware_temp/4c4e06b0-b2b9-4e89-b9a5-38be192f12a6/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk on the data store datastore1 {{(pid=59369) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 705.965214] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef38c8e8-66af-4dc2-9f00-874c9f4b57a2 {{(pid=59369) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.972963] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7264901-8267-4c65-b82c-232a2629e2e3 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.983119] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34914412-b1bb-4fca-b842-1a9cd2cd07cb {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.019569] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b3c643c-3ac0-4ed5-83b7-335b18c2b94c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.030570] env[59369]: DEBUG oslo_vmware.api [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Task: {'id': task-463289, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077838} completed successfully. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 706.031103] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Deleted the datastore file {{(pid=59369) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 706.031275] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Deleted contents of the VM from datastore datastore1 {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 706.031497] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Instance destroyed {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 706.031844] env[59369]: INFO nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 706.033163] env[59369]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cd75cca5-0af5-4771-8628-3e3f427a216c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.036978] env[59369]: DEBUG nova.compute.claims [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Aborting claim: {{(pid=59369) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 706.037162] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.037373] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.127422] env[59369]: DEBUG nova.virt.vmwareapi.images [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Downloading image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to the data store datastore1 {{(pid=59369) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 706.182087] env[59369]: DEBUG oslo_vmware.rw_handles [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 
tempest-MigrationsAdminTest-1650282106-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4c4e06b0-b2b9-4e89-b9a5-38be192f12a6/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59369) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 706.250219] env[59369]: DEBUG oslo_vmware.rw_handles [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Completed reading data from the image iterator. {{(pid=59369) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 706.250669] env[59369]: DEBUG oslo_vmware.rw_handles [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4c4e06b0-b2b9-4e89-b9a5-38be192f12a6/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59369) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 706.496877] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed0c19c0-1613-4435-aec2-21b703ecf8eb {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.506715] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a2cd270-ad17-4bda-a714-8347bc4f1ba2 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.535887] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db9ba436-b1dc-4621-907b-8a64498e1949 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.543881] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-264becbe-29db-4b1b-8404-de706b919451 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.557642] env[59369]: DEBUG nova.compute.provider_tree [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 706.566610] env[59369]: DEBUG nova.scheduler.client.report [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 
1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 706.580203] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.543s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.580691] env[59369]: ERROR nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 706.580691] env[59369]: Faults: ['InvalidArgument'] [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Traceback (most recent call last): [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] self.driver.spawn(context, instance, image_meta, [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 706.580691] env[59369]: ERROR nova.compute.manager 
[instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] self._fetch_image_if_missing(context, vi) [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] image_cache(vi, tmp_image_ds_loc) [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] vm_util.copy_virtual_disk( [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] session._wait_for_task(vmdk_copy_task) [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] return self.wait_for_task(task_ref) [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] return evt.wait() [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] result = hub.switch() [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] return self.greenlet.switch() [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] self.f(*self.args, **self.kw) [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] raise exceptions.translate_fault(task_info.error) [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Faults: ['InvalidArgument'] [ 706.580691] env[59369]: ERROR nova.compute.manager [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] [ 706.581569] env[59369]: DEBUG nova.compute.utils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] VimFaultException {{(pid=59369) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 706.583357] 
env[59369]: DEBUG nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Build of instance b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0 was re-scheduled: A specified parameter was not correct: fileType [ 706.583357] env[59369]: Faults: ['InvalidArgument'] {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 706.583767] env[59369]: DEBUG nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Unplugging VIFs for instance {{(pid=59369) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 706.583912] env[59369]: DEBUG nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59369) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 706.584071] env[59369]: DEBUG nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Deallocating network for instance {{(pid=59369) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 706.584230] env[59369]: DEBUG nova.network.neutron [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] deallocate_for_instance() {{(pid=59369) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 706.911121] env[59369]: DEBUG nova.network.neutron [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Updating instance_info_cache with network_info: [] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.923627] env[59369]: INFO nova.compute.manager [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] Took 0.34 seconds to deallocate network for instance. 
[ 707.017145] env[59369]: INFO nova.scheduler.client.report [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Deleted allocations for instance b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0 [ 707.033302] env[59369]: DEBUG oslo_concurrency.lockutils [None req-619b2735-6c49-4b63-9711-f475a9391176 tempest-ServerDiagnosticsTest-1143334552 tempest-ServerDiagnosticsTest-1143334552-project-member] Lock "b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 106.507s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.034514] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 96.560s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.035954] env[59369]: INFO nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 707.037402] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "b4c4bd0e-fba2-4e19-a1f6-c95490ed38f0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.003s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.072310] env[59369]: DEBUG nova.compute.manager [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Starting instance... {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 707.133019] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.133019] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.133019] env[59369]: INFO nova.compute.claims [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 707.328647] env[59369]: DEBUG oslo_concurrency.lockutils [None req-439264c9-18f0-493f-b01b-189c9bfc149d tempest-ServerShowV254Test-354582440 
tempest-ServerShowV254Test-354582440-project-member] Acquiring lock "ae493999-c5ad-4714-9ff9-228f4781a2ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.329053] env[59369]: DEBUG oslo_concurrency.lockutils [None req-439264c9-18f0-493f-b01b-189c9bfc149d tempest-ServerShowV254Test-354582440 tempest-ServerShowV254Test-354582440-project-member] Lock "ae493999-c5ad-4714-9ff9-228f4781a2ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.557489] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb5831f4-53f2-4ede-ba57-141006958e04 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.565980] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14ee9130-5cc4-4556-b6f9-d691209da0e9 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.598344] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de2b2d70-ca44-4844-88f8-10a616233639 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.606094] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bce39903-6bd1-4f4b-8812-74e393831537 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.621282] env[59369]: DEBUG nova.compute.provider_tree [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 
tempest-ServerTagsTestJSON-972196144-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.632145] env[59369]: DEBUG nova.scheduler.client.report [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.648662] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.518s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.649389] env[59369]: DEBUG nova.compute.manager [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Start building networks asynchronously for instance. 
{{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 707.682031] env[59369]: DEBUG nova.compute.utils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 707.683148] env[59369]: DEBUG nova.compute.manager [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Allocating IP information in the background. {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 707.683255] env[59369]: DEBUG nova.network.neutron [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 707.691881] env[59369]: DEBUG nova.compute.manager [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 707.763864] env[59369]: DEBUG nova.compute.manager [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Start spawning the instance on the hypervisor. 
{{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 707.769666] env[59369]: DEBUG nova.policy [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '86472ec6cf2347eabf5bd0e354cb80f0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b9392039fe2e4d3e8cd8127761f4e713', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}} [ 707.795343] env[59369]: DEBUG nova.virt.hardware [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 707.795573] env[59369]: DEBUG nova.virt.hardware [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 
tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 707.795722] env[59369]: DEBUG nova.virt.hardware [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 707.795893] env[59369]: DEBUG nova.virt.hardware [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Flavor pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 707.796042] env[59369]: DEBUG nova.virt.hardware [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 707.796804] env[59369]: DEBUG nova.virt.hardware [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 707.796804] env[59369]: DEBUG nova.virt.hardware [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 707.796804] env[59369]: DEBUG nova.virt.hardware [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 
tempest-ServerTagsTestJSON-972196144-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 707.796804] env[59369]: DEBUG nova.virt.hardware [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 707.796804] env[59369]: DEBUG nova.virt.hardware [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 707.797027] env[59369]: DEBUG nova.virt.hardware [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 707.797824] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b133eb6f-d654-4cfe-b7e5-2c2a331d7824 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.806140] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13358a02-8e09-4af3-ac6f-c424e1dfafc0 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.256833] env[59369]: DEBUG nova.network.neutron [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Successfully created port: bac7f439-0178-4dbb-89e5-029a45f5eb71 
{{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 709.032022] env[59369]: DEBUG nova.network.neutron [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Successfully updated port: bac7f439-0178-4dbb-89e5-029a45f5eb71 {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 709.041751] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Acquiring lock "refresh_cache-4062c0d2-5018-4a4f-9bc6-2fbe57a92a16" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 709.042339] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Acquired lock "refresh_cache-4062c0d2-5018-4a4f-9bc6-2fbe57a92a16" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 709.042339] env[59369]: DEBUG nova.network.neutron [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 709.113096] env[59369]: DEBUG nova.network.neutron [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Instance cache missing network info. 
{{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 709.217515] env[59369]: DEBUG nova.compute.manager [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Received event network-vif-plugged-bac7f439-0178-4dbb-89e5-029a45f5eb71 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 709.217515] env[59369]: DEBUG oslo_concurrency.lockutils [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] Acquiring lock "4062c0d2-5018-4a4f-9bc6-2fbe57a92a16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.217515] env[59369]: DEBUG oslo_concurrency.lockutils [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] Lock "4062c0d2-5018-4a4f-9bc6-2fbe57a92a16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.217515] env[59369]: DEBUG oslo_concurrency.lockutils [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] Lock "4062c0d2-5018-4a4f-9bc6-2fbe57a92a16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.217515] env[59369]: DEBUG nova.compute.manager [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] No waiting events found dispatching network-vif-plugged-bac7f439-0178-4dbb-89e5-029a45f5eb71 {{(pid=59369) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 709.217515] env[59369]: WARNING nova.compute.manager [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Received unexpected event network-vif-plugged-bac7f439-0178-4dbb-89e5-029a45f5eb71 for instance with vm_state building and task_state spawning. [ 709.218208] env[59369]: DEBUG nova.compute.manager [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Received event network-changed-bac7f439-0178-4dbb-89e5-029a45f5eb71 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 709.218470] env[59369]: DEBUG nova.compute.manager [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Refreshing instance network info cache due to event network-changed-bac7f439-0178-4dbb-89e5-029a45f5eb71. 
{{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 709.218719] env[59369]: DEBUG oslo_concurrency.lockutils [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] Acquiring lock "refresh_cache-4062c0d2-5018-4a4f-9bc6-2fbe57a92a16" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 709.366666] env[59369]: DEBUG nova.network.neutron [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Updating instance_info_cache with network_info: [{"id": "bac7f439-0178-4dbb-89e5-029a45f5eb71", "address": "fa:16:3e:df:ba:12", "network": {"id": "8beb79ab-9682-45d3-a543-6e820dad84d7", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1032703482-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b9392039fe2e4d3e8cd8127761f4e713", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f499bc9-78da-46c1-9274-19edf26d31cb", "external-id": "nsx-vlan-transportzone-243", "segmentation_id": 243, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbac7f439-01", "ovs_interfaceid": "bac7f439-0178-4dbb-89e5-029a45f5eb71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 709.380032] env[59369]: DEBUG 
oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Releasing lock "refresh_cache-4062c0d2-5018-4a4f-9bc6-2fbe57a92a16" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 709.380324] env[59369]: DEBUG nova.compute.manager [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Instance network_info: |[{"id": "bac7f439-0178-4dbb-89e5-029a45f5eb71", "address": "fa:16:3e:df:ba:12", "network": {"id": "8beb79ab-9682-45d3-a543-6e820dad84d7", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1032703482-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b9392039fe2e4d3e8cd8127761f4e713", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f499bc9-78da-46c1-9274-19edf26d31cb", "external-id": "nsx-vlan-transportzone-243", "segmentation_id": 243, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbac7f439-01", "ovs_interfaceid": "bac7f439-0178-4dbb-89e5-029a45f5eb71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 709.380617] env[59369]: DEBUG oslo_concurrency.lockutils [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 
service nova] Acquired lock "refresh_cache-4062c0d2-5018-4a4f-9bc6-2fbe57a92a16" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 709.380800] env[59369]: DEBUG nova.network.neutron [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Refreshing network info cache for port bac7f439-0178-4dbb-89e5-029a45f5eb71 {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 709.381851] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:df:ba:12', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3f499bc9-78da-46c1-9274-19edf26d31cb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bac7f439-0178-4dbb-89e5-029a45f5eb71', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 709.390451] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Creating folder: Project (b9392039fe2e4d3e8cd8127761f4e713). Parent ref: group-v121837. 
{{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 709.391784] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fcf9cce2-cc29-415a-b3f0-67ca2fb9d205 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.413937] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Created folder: Project (b9392039fe2e4d3e8cd8127761f4e713) in parent group-v121837. [ 709.414517] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Creating folder: Instances. Parent ref: group-v121881. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 709.414791] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-690bdb86-abe4-404f-be3c-783797ff975e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.424857] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Created folder: Instances in parent group-v121881. [ 709.424857] env[59369]: DEBUG oslo.service.loopingcall [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 709.424982] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 709.425139] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d65067b8-0155-4917-98aa-6f407e797e9d {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.451950] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 709.451950] env[59369]: value = "task-463292" [ 709.451950] env[59369]: _type = "Task" [ 709.451950] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 709.459980] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463292, 'name': CreateVM_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 709.671386] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ce2a0f8b-07d6-4b62-9fb8-88d4c08936eb tempest-ServerDiagnosticsV248Test-570916225 tempest-ServerDiagnosticsV248Test-570916225-project-member] Acquiring lock "c7b6f4c8-be99-47b6-9bb9-345ad738312a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.671621] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ce2a0f8b-07d6-4b62-9fb8-88d4c08936eb tempest-ServerDiagnosticsV248Test-570916225 tempest-ServerDiagnosticsV248Test-570916225-project-member] Lock "c7b6f4c8-be99-47b6-9bb9-345ad738312a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.759718] env[59369]: DEBUG nova.network.neutron [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Updated VIF entry in instance network info cache for port bac7f439-0178-4dbb-89e5-029a45f5eb71. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 709.759718] env[59369]: DEBUG nova.network.neutron [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Updating instance_info_cache with network_info: [{"id": "bac7f439-0178-4dbb-89e5-029a45f5eb71", "address": "fa:16:3e:df:ba:12", "network": {"id": "8beb79ab-9682-45d3-a543-6e820dad84d7", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1032703482-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b9392039fe2e4d3e8cd8127761f4e713", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f499bc9-78da-46c1-9274-19edf26d31cb", "external-id": "nsx-vlan-transportzone-243", "segmentation_id": 243, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbac7f439-01", "ovs_interfaceid": "bac7f439-0178-4dbb-89e5-029a45f5eb71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 709.770879] 
env[59369]: DEBUG oslo_concurrency.lockutils [req-7b9a8a7d-1104-4dc4-a469-f3110daf36a8 req-a3e18a61-5a31-42f4-9489-7d1b009129a2 service nova] Releasing lock "refresh_cache-4062c0d2-5018-4a4f-9bc6-2fbe57a92a16" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 709.962618] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463292, 'name': CreateVM_Task, 'duration_secs': 0.305908} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 709.962740] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 709.963414] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 709.963572] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 709.963875] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 709.964128] env[59369]: DEBUG oslo_vmware.service [-] Invoking 
HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ae7a04b0-6949-480d-88d8-df2e014a09de {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.969134] env[59369]: DEBUG oslo_vmware.api [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Waiting for the task: (returnval){ [ 709.969134] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]527bdf04-9065-d66f-1d10-21faf16d2603" [ 709.969134] env[59369]: _type = "Task" [ 709.969134] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 709.982537] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 709.983071] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 709.983071] env[59369]: DEBUG oslo_concurrency.lockutils [None req-2dbb99fa-2d1d-4cbf-ad2c-9c865e038551 tempest-ServerTagsTestJSON-972196144 tempest-ServerTagsTestJSON-972196144-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 718.174911] env[59369]: INFO nova.compute.manager [None 
req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Rebuilding instance [ 718.208573] env[59369]: DEBUG nova.objects.instance [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lazy-loading 'trusted_certs' on Instance uuid fe4378ed-29fd-4b4e-8943-80c833830357 {{(pid=59369) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 718.222373] env[59369]: DEBUG nova.compute.manager [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Checking state {{(pid=59369) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 718.223240] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7a2b370-8e73-493b-85f4-cfe066130cce {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.279825] env[59369]: DEBUG nova.objects.instance [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lazy-loading 'pci_requests' on Instance uuid fe4378ed-29fd-4b4e-8943-80c833830357 {{(pid=59369) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 718.287502] env[59369]: DEBUG nova.objects.instance [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lazy-loading 'pci_devices' on Instance uuid fe4378ed-29fd-4b4e-8943-80c833830357 {{(pid=59369) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 718.295812] env[59369]: DEBUG nova.objects.instance [None 
req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lazy-loading 'resources' on Instance uuid fe4378ed-29fd-4b4e-8943-80c833830357 {{(pid=59369) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 718.302279] env[59369]: DEBUG nova.objects.instance [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lazy-loading 'migration_context' on Instance uuid fe4378ed-29fd-4b4e-8943-80c833830357 {{(pid=59369) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 718.309042] env[59369]: DEBUG nova.objects.instance [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Trying to apply a migration context that does not seem to be set for this instance {{(pid=59369) apply_migration_context /opt/stack/nova/nova/objects/instance.py:1032}} [ 718.309439] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Powering off the VM {{(pid=59369) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 718.309677] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-c6fdd533-75a4-4d90-8913-1b05f8bbc5bf {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.317573] env[59369]: DEBUG oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for the task: (returnval){ [ 718.317573] env[59369]: value = "task-463293" [ 
718.317573] env[59369]: _type = "Task" [ 718.317573] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 718.326408] env[59369]: DEBUG oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463293, 'name': PowerOffVM_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 718.827290] env[59369]: DEBUG oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463293, 'name': PowerOffVM_Task, 'duration_secs': 0.173224} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 718.827537] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Powered off the VM {{(pid=59369) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 718.828214] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Powering off the VM {{(pid=59369) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 718.828449] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-2c9e1eb6-5120-40e6-9338-7eee92a79845 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.834905] env[59369]: DEBUG oslo_vmware.api [None 
req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for the task: (returnval){ [ 718.834905] env[59369]: value = "task-463294" [ 718.834905] env[59369]: _type = "Task" [ 718.834905] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 718.844106] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] VM already powered off {{(pid=59369) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}} [ 718.844334] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Volume detach. 
Driver type: vmdk {{(pid=59369) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 718.844477] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-121872', 'volume_id': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'name': 'volume-3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'fe4378ed-29fd-4b4e-8943-80c833830357', 'attached_at': '', 'detached_at': '', 'volume_id': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'serial': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3'} {{(pid=59369) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 718.845256] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6404b50-6f8d-4e5b-9461-8e5df93c4943 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.864332] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbed1ab4-a46e-42e0-a1f8-3c8a1aee1a42 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.870241] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b08ba1c2-5974-43b9-bd47-c2b2f884983f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.887179] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e30d9ae-8e03-4000-b650-84c26b8779c0 {{(pid=59369) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.901828] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] The volume has not been displaced from its original location: [datastore1] volume-3a1b2e8d-d5a5-493f-907f-3cfc470a56e3/volume-3a1b2e8d-d5a5-493f-907f-3cfc470a56e3.vmdk. No consolidation needed. {{(pid=59369) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 718.909111] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Reconfiguring VM instance instance-0000000b to detach disk 2000 {{(pid=59369) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 718.909111] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-36a53e8f-daf5-418b-9fbf-0e87f4954794 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.924143] env[59369]: DEBUG oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for the task: (returnval){ [ 718.924143] env[59369]: value = "task-463295" [ 718.924143] env[59369]: _type = "Task" [ 718.924143] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 718.933386] env[59369]: DEBUG oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463295, 'name': ReconfigVM_Task} progress is 6%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 719.434168] env[59369]: DEBUG oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463295, 'name': ReconfigVM_Task, 'duration_secs': 0.139361} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 719.434455] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Reconfigured VM instance instance-0000000b to detach disk 2000 {{(pid=59369) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 719.439079] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-acc71468-b09a-4d09-a632-078b2e40e510 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.453585] env[59369]: DEBUG oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for the task: (returnval){ [ 719.453585] env[59369]: value = "task-463296" [ 719.453585] env[59369]: _type = "Task" [ 719.453585] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 719.461701] env[59369]: DEBUG oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463296, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 719.963976] env[59369]: DEBUG oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463296, 'name': ReconfigVM_Task, 'duration_secs': 0.131283} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 719.963976] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-121872', 'volume_id': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'name': 'volume-3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'fe4378ed-29fd-4b4e-8943-80c833830357', 'attached_at': '', 'detached_at': '', 'volume_id': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3', 'serial': '3a1b2e8d-d5a5-493f-907f-3cfc470a56e3'} {{(pid=59369) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 719.963976] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Destroying instance {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 719.964683] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-729a8cae-c28c-4ece-bbc2-d569e6cdae86 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.971222] env[59369]: DEBUG 
nova.virt.vmwareapi.vmops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Unregistering the VM {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 719.971433] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dcce8fe6-bbd3-4d92-a3e5-f308b36c6a9f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.029993] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Unregistered the VM {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 720.030236] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Deleting contents of the VM from datastore datastore1 {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 720.030416] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Deleting the datastore file [datastore1] fe4378ed-29fd-4b4e-8943-80c833830357 {{(pid=59369) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 720.030665] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0d4242fa-6c60-4498-babb-37d264dd7c15 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.037124] env[59369]: DEBUG 
oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for the task: (returnval){ [ 720.037124] env[59369]: value = "task-463298" [ 720.037124] env[59369]: _type = "Task" [ 720.037124] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 720.045266] env[59369]: DEBUG oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463298, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 720.546867] env[59369]: DEBUG oslo_vmware.api [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Task: {'id': task-463298, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076449} completed successfully. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 720.547275] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Deleted the datastore file {{(pid=59369) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 720.547275] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Deleted contents of the VM from datastore datastore1 {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 720.547433] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Instance destroyed {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 720.599631] env[59369]: DEBUG nova.virt.vmwareapi.volumeops [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Volume detach. 
Driver type: vmdk {{(pid=59369) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 720.599962] env[59369]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1f051b8c-0484-45d9-a650-af0e7174c3ed {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.608164] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c17d318c-e2a2-4bd7-ae5a-06ac67aab9c1 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.638700] env[59369]: ERROR nova.compute.manager [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Failed to detach volume 3a1b2e8d-d5a5-493f-907f-3cfc470a56e3 from /dev/sda: nova.exception.InstanceNotFound: Instance fe4378ed-29fd-4b4e-8943-80c833830357 could not be found. 
[ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Traceback (most recent call last): [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] self.driver.rebuild(**kwargs) [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] raise NotImplementedError() [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] NotImplementedError [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] During handling of the above exception, another exception occurred: [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Traceback (most recent call last): [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] self.driver.detach_volume(context, old_connection_info, [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: 
fe4378ed-29fd-4b4e-8943-80c833830357] return self._volumeops.detach_volume(connection_info, instance) [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] self._detach_volume_vmdk(connection_info, instance) [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] stable_ref.fetch_moref(session) [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] raise exception.InstanceNotFound(instance_id=self._uuid) [ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] nova.exception.InstanceNotFound: Instance fe4378ed-29fd-4b4e-8943-80c833830357 could not be found. 
[ 720.638700] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] [ 720.764323] env[59369]: DEBUG nova.compute.utils [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Build of instance fe4378ed-29fd-4b4e-8943-80c833830357 aborted: Failed to rebuild volume backed instance. {{(pid=59369) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 720.766745] env[59369]: ERROR nova.compute.manager [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Setting instance vm_state to ERROR: nova.exception.BuildAbortException: Build of instance fe4378ed-29fd-4b4e-8943-80c833830357 aborted: Failed to rebuild volume backed instance. [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Traceback (most recent call last): [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] self.driver.rebuild(**kwargs) [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] raise NotImplementedError() [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] NotImplementedError [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] [ 720.766745] env[59369]: ERROR 
nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] During handling of the above exception, another exception occurred: [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Traceback (most recent call last): [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/compute/manager.py", line 3570, in _rebuild_volume_backed_instance [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] self._detach_root_volume(context, instance, root_bdm) [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/compute/manager.py", line 3549, in _detach_root_volume [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] with excutils.save_and_reraise_exception(): [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] self.force_reraise() [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] raise self.value [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] 
self.driver.detach_volume(context, old_connection_info, [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] return self._volumeops.detach_volume(connection_info, instance) [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] self._detach_volume_vmdk(connection_info, instance) [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] stable_ref.fetch_moref(session) [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] raise exception.InstanceNotFound(instance_id=self._uuid) [ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] nova.exception.InstanceNotFound: Instance fe4378ed-29fd-4b4e-8943-80c833830357 could not be found. 
[ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]
[ 720.766745] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] During handling of the above exception, another exception occurred:
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Traceback (most recent call last):
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]   File "/opt/stack/nova/nova/compute/manager.py", line 10736, in _error_out_instance_on_exception
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]     yield
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]   File "/opt/stack/nova/nova/compute/manager.py", line 3826, in rebuild_instance
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]     self._do_rebuild_instance_with_claim(
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]   File "/opt/stack/nova/nova/compute/manager.py", line 3912, in _do_rebuild_instance_with_claim
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]     self._do_rebuild_instance(
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]   File "/opt/stack/nova/nova/compute/manager.py", line 4104, in _do_rebuild_instance
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]     self._rebuild_default_impl(**kwargs)
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]   File "/opt/stack/nova/nova/compute/manager.py", line 3693, in _rebuild_default_impl
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]     self._rebuild_volume_backed_instance(
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]   File "/opt/stack/nova/nova/compute/manager.py", line 3585, in _rebuild_volume_backed_instance
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]     raise exception.BuildAbortException(
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357] nova.exception.BuildAbortException: Build of instance fe4378ed-29fd-4b4e-8943-80c833830357 aborted: Failed to rebuild volume backed instance.
[ 720.767936] env[59369]: ERROR nova.compute.manager [instance: fe4378ed-29fd-4b4e-8943-80c833830357]
[ 720.867018] env[59369]: DEBUG oslo_concurrency.lockutils [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 720.867018] env[59369]: DEBUG oslo_concurrency.lockutils [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 721.171669] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04dde50b-16a4-488a-a333-adce1fc061a8 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 721.180823] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-8d5f9aa8-877d-4315-9925-a1731f101b26 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.210017] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1763c05c-3437-48e3-abdb-841927707b4b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.217059] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03a10e22-7dce-4db0-b8bb-2d3b780f54b5 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.229731] env[59369]: DEBUG nova.compute.provider_tree [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 721.238557] env[59369]: DEBUG nova.scheduler.client.report [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 721.254196] env[59369]: DEBUG oslo_concurrency.lockutils [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 
tempest-ServerActionsV293TestJSON-588358398-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.388s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.254414] env[59369]: INFO nova.compute.manager [None req-20b552f5-adc9-43d0-9739-7d188ec35348 tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Successfully reverted task state from rebuilding on failure for instance. [ 721.983533] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Acquiring lock "fe4378ed-29fd-4b4e-8943-80c833830357" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.983799] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lock "fe4378ed-29fd-4b4e-8943-80c833830357" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.984070] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Acquiring lock "fe4378ed-29fd-4b4e-8943-80c833830357-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.984220] env[59369]: DEBUG 
oslo_concurrency.lockutils [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lock "fe4378ed-29fd-4b4e-8943-80c833830357-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.984423] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lock "fe4378ed-29fd-4b4e-8943-80c833830357-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.986323] env[59369]: INFO nova.compute.manager [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Terminating instance [ 721.988469] env[59369]: DEBUG nova.compute.manager [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Start destroying the instance on the hypervisor. 
{{(pid=59369) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 721.988916] env[59369]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-60e730b3-21d4-4b15-9e9a-aed04a2e4d64 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.997396] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23fafce7-7ceb-4583-a254-5fc230eaf14e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.026964] env[59369]: WARNING nova.virt.vmwareapi.driver [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Instance does not exist. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance fe4378ed-29fd-4b4e-8943-80c833830357 could not be found. 
[ 722.026964] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Destroying instance {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 722.026964] env[59369]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ce72d367-c977-4f7a-bfe1-6993c0bfac4c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.038188] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a8c6809-3aaf-4ff0-a1c6-d58e7050d085 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.067784] env[59369]: WARNING nova.virt.vmwareapi.vmops [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fe4378ed-29fd-4b4e-8943-80c833830357 could not be found. [ 722.068211] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Instance destroyed {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 722.068507] env[59369]: INFO nova.compute.manager [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Took 0.08 seconds to destroy the instance on the hypervisor. 
[ 722.068768] env[59369]: DEBUG oslo.service.loopingcall [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 722.068970] env[59369]: DEBUG nova.compute.manager [-] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Deallocating network for instance {{(pid=59369) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 722.069081] env[59369]: DEBUG nova.network.neutron [-] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] deallocate_for_instance() {{(pid=59369) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 722.881890] env[59369]: DEBUG nova.network.neutron [-] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Updating instance_info_cache with network_info: [] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.899133] env[59369]: INFO nova.compute.manager [-] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Took 0.83 seconds to deallocate network for instance. [ 722.907833] env[59369]: DEBUG nova.compute.manager [req-3ad8e2ed-0a90-4a58-a071-6ceea90cfad5 req-5fd04a7f-5c61-4da8-9226-209140d3ed15 service nova] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Received event network-vif-deleted-8c655868-3e30-4f41-8743-7288d0faa33c {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 722.955622] env[59369]: INFO nova.compute.manager [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Took 0.06 seconds to detach 1 volumes for instance. 
[ 722.958197] env[59369]: DEBUG nova.compute.manager [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] [instance: fe4378ed-29fd-4b4e-8943-80c833830357] Deleting volume: 3a1b2e8d-d5a5-493f-907f-3cfc470a56e3 {{(pid=59369) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3217}} [ 723.030315] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.030580] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.030777] env[59369]: DEBUG nova.objects.instance [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lazy-loading 'resources' on Instance uuid fe4378ed-29fd-4b4e-8943-80c833830357 {{(pid=59369) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 723.440474] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8827f30-584e-4d61-a29c-88417c8b8a4d {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.449119] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ecd7f53-0b3c-4dae-a165-cb1d97eab806 
{{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.479772] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33c214d0-f757-4439-bad2-81a2b4973a99 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.487279] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb6a8ac7-3857-46d0-aab4-fd4ce296439c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.500310] env[59369]: DEBUG nova.compute.provider_tree [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 723.509024] env[59369]: DEBUG nova.scheduler.client.report [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 723.524447] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.494s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.579905] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1267c56d-80e9-409a-9256-ceb9465e892f tempest-ServerActionsV293TestJSON-588358398 tempest-ServerActionsV293TestJSON-588358398-project-member] Lock "fe4378ed-29fd-4b4e-8943-80c833830357" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.596s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.765073] env[59369]: DEBUG oslo_concurrency.lockutils [None req-0463cddb-ef9c-479c-b8b4-3f7d8b69261b tempest-AttachVolumeTestJSON-879001401 tempest-AttachVolumeTestJSON-879001401-project-member] Acquiring lock "dd191894-9cd2-4b87-9313-14dfe0636fbb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.765342] env[59369]: DEBUG oslo_concurrency.lockutils [None req-0463cddb-ef9c-479c-b8b4-3f7d8b69261b tempest-AttachVolumeTestJSON-879001401 tempest-AttachVolumeTestJSON-879001401-project-member] Lock "dd191894-9cd2-4b87-9313-14dfe0636fbb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.257154] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 735.257412] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running 
periodic task ComputeManager._poll_rescued_instances {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 736.257915] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 736.257915] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Starting heal instance info cache {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 736.257915] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Rebuilding the list of instances to heal {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 736.277620] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 736.277773] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 736.277900] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: d501202a-354c-42e1-8480-f026d5216a58] Skipping network cache update for instance because it is Building. 
{{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 736.278043] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 736.278371] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 736.278371] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 736.278520] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 736.278520] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 736.278624] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Skipping network cache update for instance because it is Building. 
{{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 736.278738] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 736.278855] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Didn't find any instances for network info cache update. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 736.279406] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 736.279513] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59369) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 736.280312] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 736.289113] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.289113] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.289113] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.289216] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59369) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 736.290551] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee8d8e8a-d820-435e-ae91-b30ca33b009b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.299512] env[59369]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be7e29f8-9b0e-46e2-804b-d4b6581320de {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.312847] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ae006de-1cea-4cf0-899f-bfe4320fba2b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.319244] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-768e5d6f-e8dd-4521-8704-ab25c05c8390 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.348530] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181780MB free_disk=116GB free_vcpus=48 pci_devices=None {{(pid=59369) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 736.348703] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.348862] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.413185] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 02929ad5-b2f4-4a80-a606-2a0c9b6222e4 actively managed on this compute host 
and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 736.413345] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance eae87f89-8488-42e6-b065-1198bfbe8177 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 736.413466] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance d501202a-354c-42e1-8480-f026d5216a58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 736.414839] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance ad662e06-1a0f-4110-8d23-8ff6c6889eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 736.414839] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 3089e10b-f9fd-4049-b8f4-9297fe6a7c86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 736.414839] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 3117e247-0538-4a30-a0d7-aa47247a6da1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 736.414839] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance f2673a5e-28b0-4a93-b93b-8eef64380e08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 736.414839] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 8960dfc9-2908-4113-bebd-3a045d3e135c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 736.414839] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance ed81e73a-8787-4c22-842b-891d2ef116c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 736.414839] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 736.440902] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance f1815366-ecdb-4c46-a719-15f0ecdf5717 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.467372] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance f6b26955-b900-40d2-a910-57a4435c629c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.477984] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 598a1045-cf7a-43e5-ae0d-438b246b6f7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.487987] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 2e0fe984-3c0c-40cb-95e0-0090f7b1b36c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.498041] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance c6e58407-630f-4c11-a810-2ffec58fc397 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.509328] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 11a0c528-536f-4df5-906a-d4fa9fd581d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.522439] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance fbc72ed9-2e72-4c09-bfaa-48d56df2edc0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.534014] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance a7c91b61-dc9b-4d79-b18f-4339b2248e04 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.542809] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 35ba3bec-5278-496f-aa6c-a87861e1d8e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.553544] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 3106cf2f-1f97-4f1e-94a0-718f5fb735c8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.562827] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 4de279b1-f754-4c7b-a53c-90824b17e766 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.572108] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 83fad30b-6d18-40af-89fa-ccbc25dcc968 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.582881] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 84c38578-ad32-4847-a6f1-94b2a288f89c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.592841] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 1ce49896-dfcf-4091-a1d2-257babf2e2cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.603988] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance f6c3dea6-e885-448d-b711-5d1e430f0664 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.615455] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance ae493999-c5ad-4714-9ff9-228f4781a2ee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.625805] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance c7b6f4c8-be99-47b6-9bb9-345ad738312a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.636576] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance dd191894-9cd2-4b87-9313-14dfe0636fbb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 736.636710] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59369) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 736.636836] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59369) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 736.952202] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e006ad10-5e16-428c-919c-90a913f6ebdd {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 736.959907] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e87664d-ea4b-4235-bf7b-84c9be46b13e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 736.988948] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6079758d-09bf-4353-b301-6dc3e1fe4b6e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 736.995682] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-495c33b9-9bdf-406a-9253-cd35122c2bd5 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 737.010023] env[59369]: DEBUG nova.compute.provider_tree [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 737.017013] env[59369]: DEBUG nova.scheduler.client.report [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 737.029907] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59369) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 737.030093] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.681s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 738.008466] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 738.008734] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 738.008864] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 739.257646] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 754.075753] env[59369]: WARNING oslo_vmware.rw_handles [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 754.075753] env[59369]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 754.075753] env[59369]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 754.075753] env[59369]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 754.075753] env[59369]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 754.075753] env[59369]: ERROR oslo_vmware.rw_handles     response.begin()
[ 754.075753] env[59369]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 754.075753] env[59369]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 754.075753] env[59369]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 754.075753] env[59369]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 754.075753] env[59369]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 754.075753] env[59369]: ERROR oslo_vmware.rw_handles
[ 754.076382] env[59369]: DEBUG nova.virt.vmwareapi.images [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Downloaded image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to vmware_temp/4c4e06b0-b2b9-4e89-b9a5-38be192f12a6/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk on the data store datastore1 {{(pid=59369) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 754.077784] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Caching image {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 754.078044] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Copying Virtual Disk [datastore1] vmware_temp/4c4e06b0-b2b9-4e89-b9a5-38be192f12a6/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk to [datastore1] vmware_temp/4c4e06b0-b2b9-4e89-b9a5-38be192f12a6/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk {{(pid=59369) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 754.078826] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f375a705-7b51-4ca8-a4c9-0a9344e0ee5d {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 754.086605] env[59369]: DEBUG oslo_vmware.api [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Waiting for the task: (returnval){
[ 754.086605] env[59369]:     value = "task-463300"
[ 754.086605] env[59369]:     _type = "Task"
[ 754.086605] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 754.094071] env[59369]: DEBUG oslo_vmware.api [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Task: {'id': task-463300, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 754.596952] env[59369]: DEBUG oslo_vmware.exceptions [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Fault InvalidArgument not matched. {{(pid=59369) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 754.597211] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 754.597801] env[59369]: ERROR nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 754.597801] env[59369]: Faults: ['InvalidArgument']
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Traceback (most recent call last):
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     yield resources
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     self.driver.spawn(context, instance, image_meta,
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     self._fetch_image_if_missing(context, vi)
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     image_cache(vi, tmp_image_ds_loc)
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     vm_util.copy_virtual_disk(
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     session._wait_for_task(vmdk_copy_task)
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     return self.wait_for_task(task_ref)
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     return evt.wait()
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     result = hub.switch()
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     return self.greenlet.switch()
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     self.f(*self.args, **self.kw)
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]     raise exceptions.translate_fault(task_info.error)
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Faults: ['InvalidArgument']
[ 754.597801] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]
[ 754.598655] env[59369]: INFO nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Terminating instance
[ 754.600076] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 754.600076] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 754.600076] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-331f3759-5a57-4c10-a753-35409c5f13db {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 754.602314] env[59369]: DEBUG nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Start destroying the instance on the hypervisor. {{(pid=59369) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 754.602497] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Destroying instance {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 754.603210] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4673cb8-aa85-4bbb-9102-cfccb2170fad {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 754.609958] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Unregistering the VM {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 754.610244] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-be4ecefd-1546-4c95-ab89-e94a1c48d69f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 754.612360] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 754.612529] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59369) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 754.613534] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-112152bf-82f8-4b34-84fb-1a4592b1b606 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 754.618400] env[59369]: DEBUG oslo_vmware.api [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Waiting for the task: (returnval){
[ 754.618400] env[59369]:     value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]52cd8f40-fe68-3601-cf45-9142773b8608"
[ 754.618400] env[59369]:     _type = "Task"
[ 754.618400] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 754.625230] env[59369]: DEBUG oslo_vmware.api [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]52cd8f40-fe68-3601-cf45-9142773b8608, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 754.697635] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Unregistered the VM {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 754.697635] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Deleting contents of the VM from datastore datastore1 {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 754.697635] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Deleting the datastore file [datastore1] 02929ad5-b2f4-4a80-a606-2a0c9b6222e4 {{(pid=59369) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 754.697934] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-175d4758-cb3a-46c4-b85a-4f5cd5ea764c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 754.704080] env[59369]: DEBUG oslo_vmware.api [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Waiting for the task: (returnval){
[ 754.704080] env[59369]:     value = "task-463302"
[ 754.704080] env[59369]:     _type = "Task"
[ 754.704080] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 754.711686] env[59369]: DEBUG oslo_vmware.api [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Task: {'id': task-463302, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 755.130101] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Preparing fetch location {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 755.130439] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Creating directory with path [datastore1] vmware_temp/cb86b5bb-8328-4cd5-8bbd-98ee15dda909/dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 755.130581] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f74ed612-692e-4dc6-bc95-3917a9cfc424 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 755.142389] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Created directory with path [datastore1] vmware_temp/cb86b5bb-8328-4cd5-8bbd-98ee15dda909/dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 755.142632] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Fetch image to [datastore1] vmware_temp/cb86b5bb-8328-4cd5-8bbd-98ee15dda909/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 755.142841] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Downloading image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to [datastore1] vmware_temp/cb86b5bb-8328-4cd5-8bbd-98ee15dda909/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk on the data store datastore1 {{(pid=59369) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 755.145105] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0faf153f-13b0-4985-8e32-6874dedb7559 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 755.151178] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0f553ef-8a51-4948-bac9-b1bb6a108f8f {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 755.160098] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c5a22a9-e15f-4a27-8216-17d12f9faabc {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 755.191604] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01888b30-7b63-4792-b419-fa2b5420aa47 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 755.197486] env[59369]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ed58096e-f867-4ae5-86d8-742781875325 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 755.212736] env[59369]: DEBUG oslo_vmware.api [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Task: {'id': task-463302, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072525} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 755.213033] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Deleted the datastore file {{(pid=59369) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 755.213250] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Deleted contents of the VM from datastore datastore1 {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 755.213424] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Instance destroyed {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 755.213590] env[59369]: INFO nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4]
Took 0.61 seconds to destroy the instance on the hypervisor. [ 755.216271] env[59369]: DEBUG nova.compute.claims [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Aborting claim: {{(pid=59369) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 755.216532] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 755.216856] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 755.221451] env[59369]: DEBUG nova.virt.vmwareapi.images [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Downloading image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to the data store datastore1 {{(pid=59369) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 755.270449] env[59369]: DEBUG oslo_vmware.rw_handles [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cb86b5bb-8328-4cd5-8bbd-98ee15dda909/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59369) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 755.327933] env[59369]: DEBUG oslo_vmware.rw_handles [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Completed reading data from the image iterator. {{(pid=59369) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 755.328105] env[59369]: DEBUG oslo_vmware.rw_handles [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cb86b5bb-8328-4cd5-8bbd-98ee15dda909/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59369) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 755.623219] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27b32457-29cd-4195-94b2-edb9dbdd1c19 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.630816] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98f3928a-c812-4ae3-9fd2-55094d137a2c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.659362] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0cf552b-2b9d-443d-945b-2bc4f7ceb955 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.666394] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba3c7bfc-f4b3-4ce0-bdae-1e3879bd4387 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.679734] env[59369]: DEBUG nova.compute.provider_tree [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 755.687812] env[59369]: DEBUG nova.scheduler.client.report [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 755.704475] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.487s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 755.704873] env[59369]: ERROR nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 755.704873] env[59369]: Faults: ['InvalidArgument'] [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Traceback (most recent call last): [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] self.driver.spawn(context, instance, image_meta, [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 
02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] self._fetch_image_if_missing(context, vi) [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] image_cache(vi, tmp_image_ds_loc) [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] vm_util.copy_virtual_disk( [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] session._wait_for_task(vmdk_copy_task) [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] return self.wait_for_task(task_ref) [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] return evt.wait() [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] result = hub.switch() [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] return self.greenlet.switch() [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] self.f(*self.args, **self.kw) [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] raise exceptions.translate_fault(task_info.error) [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Faults: ['InvalidArgument'] [ 755.704873] env[59369]: ERROR nova.compute.manager [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] [ 755.705760] env[59369]: DEBUG nova.compute.utils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] VimFaultException {{(pid=59369) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 755.706905] 
env[59369]: DEBUG nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Build of instance 02929ad5-b2f4-4a80-a606-2a0c9b6222e4 was re-scheduled: A specified parameter was not correct: fileType [ 755.706905] env[59369]: Faults: ['InvalidArgument'] {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 755.707289] env[59369]: DEBUG nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Unplugging VIFs for instance {{(pid=59369) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 755.707455] env[59369]: DEBUG nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59369) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 755.707604] env[59369]: DEBUG nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Deallocating network for instance {{(pid=59369) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 755.707757] env[59369]: DEBUG nova.network.neutron [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] deallocate_for_instance() {{(pid=59369) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 755.973486] env[59369]: DEBUG nova.network.neutron [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Updating instance_info_cache with network_info: [] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.989428] env[59369]: INFO nova.compute.manager [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] Took 0.28 seconds to deallocate network for instance. 
[ 756.095332] env[59369]: INFO nova.scheduler.client.report [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Deleted allocations for instance 02929ad5-b2f4-4a80-a606-2a0c9b6222e4 [ 756.110442] env[59369]: DEBUG oslo_concurrency.lockutils [None req-a9ad5593-f398-4e24-bc5c-38b9005e7fb3 tempest-MigrationsAdminTest-1650282106 tempest-MigrationsAdminTest-1650282106-project-member] Lock "02929ad5-b2f4-4a80-a606-2a0c9b6222e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 156.158s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.112404] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "02929ad5-b2f4-4a80-a606-2a0c9b6222e4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 145.638s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.112404] env[59369]: INFO nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 02929ad5-b2f4-4a80-a606-2a0c9b6222e4] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 756.112404] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "02929ad5-b2f4-4a80-a606-2a0c9b6222e4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.128017] env[59369]: DEBUG nova.compute.manager [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Starting instance... {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 756.183024] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.183024] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.183024] env[59369]: INFO nova.compute.claims [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 756.597669] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-bbc89020-0644-4666-84c9-8aff3f6fa831 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.608804] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19a747ce-566c-4d9a-8079-89c0895a2511 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.661024] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b913ba0-904f-4e67-a16a-b31afe3119d2 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.666382] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78fdeebd-6e85-4239-be13-f6d830d137af {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.679839] env[59369]: DEBUG nova.compute.provider_tree [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 756.689153] env[59369]: DEBUG nova.scheduler.client.report [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 756.708464] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.528s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.709023] env[59369]: DEBUG nova.compute.manager [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Start building networks asynchronously for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 756.744211] env[59369]: DEBUG nova.compute.utils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Using /dev/sd instead of None {{(pid=59369) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 756.746042] env[59369]: DEBUG nova.compute.manager [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Allocating IP information in the background. 
{{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 756.746218] env[59369]: DEBUG nova.network.neutron [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] allocate_for_instance() {{(pid=59369) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 756.755546] env[59369]: DEBUG nova.compute.manager [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Start building block device mappings for instance. {{(pid=59369) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 756.813058] env[59369]: DEBUG nova.policy [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4b07911176524c0f8975c1e0ff4b8f38', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2b4b0f9568c94223acebbfc06a2c63b5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59369) authorize /opt/stack/nova/nova/policy.py:203}} [ 756.864081] env[59369]: DEBUG nova.compute.manager [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Start spawning the instance on the hypervisor. 
{{(pid=59369) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 756.887364] env[59369]: DEBUG nova.virt.hardware [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-29T11:50:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-29T11:50:41Z,direct_url=,disk_format='vmdk',id=dc3bad91-06a4-485e-bdc4-b0135384e686,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='7386534ab8334806bb50f77e095e084c',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-29T11:50:42Z,virtual_size=,visibility=), allow threads: False {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 756.887621] env[59369]: DEBUG nova.virt.hardware [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Flavor limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 756.887772] env[59369]: DEBUG nova.virt.hardware [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Image limits 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 756.887947] env[59369]: DEBUG nova.virt.hardware [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Flavor pref 0:0:0 
{{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 756.888296] env[59369]: DEBUG nova.virt.hardware [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Image pref 0:0:0 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 756.888296] env[59369]: DEBUG nova.virt.hardware [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59369) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 756.888434] env[59369]: DEBUG nova.virt.hardware [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 756.888587] env[59369]: DEBUG nova.virt.hardware [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 756.888750] env[59369]: DEBUG nova.virt.hardware [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Got 1 possible topologies {{(pid=59369) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 756.888911] env[59369]: DEBUG nova.virt.hardware [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 
tempest-MultipleCreateTestJSON-1252325267-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 756.889090] env[59369]: DEBUG nova.virt.hardware [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59369) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 756.889989] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9df4922b-27bc-4c62-9ee7-47f505fce9f1 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.900026] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e55a1ae-745d-49e4-a3b3-85f31f414818 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.225829] env[59369]: DEBUG nova.network.neutron [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Successfully created port: 89736c0d-9c39-4076-af2a-59dd9adda150 {{(pid=59369) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 758.141172] env[59369]: DEBUG nova.compute.manager [req-c74ba3da-8e6a-491b-b2f8-d2a20bbd94af req-d0aab9ff-4e49-4925-9eb0-d55917cd11af service nova] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Received event network-vif-plugged-89736c0d-9c39-4076-af2a-59dd9adda150 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 758.141395] env[59369]: DEBUG oslo_concurrency.lockutils [req-c74ba3da-8e6a-491b-b2f8-d2a20bbd94af req-d0aab9ff-4e49-4925-9eb0-d55917cd11af service 
nova] Acquiring lock "f1815366-ecdb-4c46-a719-15f0ecdf5717-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 758.141596] env[59369]: DEBUG oslo_concurrency.lockutils [req-c74ba3da-8e6a-491b-b2f8-d2a20bbd94af req-d0aab9ff-4e49-4925-9eb0-d55917cd11af service nova] Lock "f1815366-ecdb-4c46-a719-15f0ecdf5717-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 758.141756] env[59369]: DEBUG oslo_concurrency.lockutils [req-c74ba3da-8e6a-491b-b2f8-d2a20bbd94af req-d0aab9ff-4e49-4925-9eb0-d55917cd11af service nova] Lock "f1815366-ecdb-4c46-a719-15f0ecdf5717-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 758.141910] env[59369]: DEBUG nova.compute.manager [req-c74ba3da-8e6a-491b-b2f8-d2a20bbd94af req-d0aab9ff-4e49-4925-9eb0-d55917cd11af service nova] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] No waiting events found dispatching network-vif-plugged-89736c0d-9c39-4076-af2a-59dd9adda150 {{(pid=59369) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 758.142379] env[59369]: WARNING nova.compute.manager [req-c74ba3da-8e6a-491b-b2f8-d2a20bbd94af req-d0aab9ff-4e49-4925-9eb0-d55917cd11af service nova] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Received unexpected event network-vif-plugged-89736c0d-9c39-4076-af2a-59dd9adda150 for instance with vm_state building and task_state spawning. 
[ 758.225351] env[59369]: DEBUG nova.network.neutron [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Successfully updated port: 89736c0d-9c39-4076-af2a-59dd9adda150 {{(pid=59369) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 758.235665] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Acquiring lock "refresh_cache-f1815366-ecdb-4c46-a719-15f0ecdf5717" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 758.235914] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Acquired lock "refresh_cache-f1815366-ecdb-4c46-a719-15f0ecdf5717" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 758.235947] env[59369]: DEBUG nova.network.neutron [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Building network info cache for instance {{(pid=59369) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 758.272066] env[59369]: DEBUG nova.network.neutron [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Instance cache missing network info. 
{{(pid=59369) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 758.471379] env[59369]: DEBUG nova.network.neutron [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Updating instance_info_cache with network_info: [{"id": "89736c0d-9c39-4076-af2a-59dd9adda150", "address": "fa:16:3e:7b:ef:62", "network": {"id": "df0bb3c8-6412-4a02-8c1e-df873e98101f", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-2124519556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b4b0f9568c94223acebbfc06a2c63b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c68b7663-4f0e-47f0-ac7f-40c6d952f7bb", "external-id": "nsx-vlan-transportzone-696", "segmentation_id": 696, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap89736c0d-9c", "ovs_interfaceid": "89736c0d-9c39-4076-af2a-59dd9adda150", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.487993] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Releasing lock "refresh_cache-f1815366-ecdb-4c46-a719-15f0ecdf5717" {{(pid=59369) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 758.488328] env[59369]: DEBUG nova.compute.manager [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Instance network_info: |[{"id": "89736c0d-9c39-4076-af2a-59dd9adda150", "address": "fa:16:3e:7b:ef:62", "network": {"id": "df0bb3c8-6412-4a02-8c1e-df873e98101f", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-2124519556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b4b0f9568c94223acebbfc06a2c63b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c68b7663-4f0e-47f0-ac7f-40c6d952f7bb", "external-id": "nsx-vlan-transportzone-696", "segmentation_id": 696, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap89736c0d-9c", "ovs_interfaceid": "89736c0d-9c39-4076-af2a-59dd9adda150", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59369) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 758.488697] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7b:ef:62', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': 'c68b7663-4f0e-47f0-ac7f-40c6d952f7bb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '89736c0d-9c39-4076-af2a-59dd9adda150', 'vif_model': 'vmxnet3'}] {{(pid=59369) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 758.496969] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Creating folder: Project (2b4b0f9568c94223acebbfc06a2c63b5). Parent ref: group-v121837. {{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 758.496969] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c985cea3-42ed-45bc-a97d-af5c6eee4933 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.507712] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Created folder: Project (2b4b0f9568c94223acebbfc06a2c63b5) in parent group-v121837. [ 758.507895] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Creating folder: Instances. Parent ref: group-v121884. 
{{(pid=59369) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 758.508168] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9fd37939-ab68-43b1-a457-1b5cb9531eb3 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.517789] env[59369]: INFO nova.virt.vmwareapi.vm_util [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Created folder: Instances in parent group-v121884. [ 758.517789] env[59369]: DEBUG oslo.service.loopingcall [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59369) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 758.517789] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Creating VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 758.517968] env[59369]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a3c8a51a-cecf-4693-b174-c1627a574085 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.537478] env[59369]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 758.537478] env[59369]: value = "task-463305" [ 758.537478] env[59369]: _type = "Task" [ 758.537478] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 758.547527] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463305, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 759.047265] env[59369]: DEBUG oslo_vmware.api [-] Task: {'id': task-463305, 'name': CreateVM_Task, 'duration_secs': 0.28835} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 759.047426] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Created VM on the ESX host {{(pid=59369) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 759.048085] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 759.048242] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 759.048543] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 759.048804] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8e1d44e1-ace8-4dba-a90b-96aa87d5ecb5 {{(pid=59369) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 759.054431] env[59369]: DEBUG oslo_vmware.api [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Waiting for the task: (returnval){ [ 759.054431] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]523da79e-ed09-cf10-00a9-315270f69bea" [ 759.054431] env[59369]: _type = "Task" [ 759.054431] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 759.064089] env[59369]: DEBUG oslo_vmware.api [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]523da79e-ed09-cf10-00a9-315270f69bea, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 759.564384] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 759.564686] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Processing image dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 759.564819] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 
tempest-MultipleCreateTestJSON-1252325267-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 760.176643] env[59369]: DEBUG nova.compute.manager [req-930a4654-a196-4b01-937c-3222753fdda1 req-d80db556-3c2f-4721-b3ca-49240baaea88 service nova] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Received event network-changed-89736c0d-9c39-4076-af2a-59dd9adda150 {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 760.176859] env[59369]: DEBUG nova.compute.manager [req-930a4654-a196-4b01-937c-3222753fdda1 req-d80db556-3c2f-4721-b3ca-49240baaea88 service nova] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Refreshing instance network info cache due to event network-changed-89736c0d-9c39-4076-af2a-59dd9adda150. {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 760.177034] env[59369]: DEBUG oslo_concurrency.lockutils [req-930a4654-a196-4b01-937c-3222753fdda1 req-d80db556-3c2f-4721-b3ca-49240baaea88 service nova] Acquiring lock "refresh_cache-f1815366-ecdb-4c46-a719-15f0ecdf5717" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 760.177208] env[59369]: DEBUG oslo_concurrency.lockutils [req-930a4654-a196-4b01-937c-3222753fdda1 req-d80db556-3c2f-4721-b3ca-49240baaea88 service nova] Acquired lock "refresh_cache-f1815366-ecdb-4c46-a719-15f0ecdf5717" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 760.177305] env[59369]: DEBUG nova.network.neutron [req-930a4654-a196-4b01-937c-3222753fdda1 req-d80db556-3c2f-4721-b3ca-49240baaea88 service nova] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Refreshing network info cache for port 89736c0d-9c39-4076-af2a-59dd9adda150 {{(pid=59369) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2007}} [ 760.558510] env[59369]: DEBUG nova.network.neutron [req-930a4654-a196-4b01-937c-3222753fdda1 req-d80db556-3c2f-4721-b3ca-49240baaea88 service nova] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Updated VIF entry in instance network info cache for port 89736c0d-9c39-4076-af2a-59dd9adda150. {{(pid=59369) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 760.558510] env[59369]: DEBUG nova.network.neutron [req-930a4654-a196-4b01-937c-3222753fdda1 req-d80db556-3c2f-4721-b3ca-49240baaea88 service nova] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Updating instance_info_cache with network_info: [{"id": "89736c0d-9c39-4076-af2a-59dd9adda150", "address": "fa:16:3e:7b:ef:62", "network": {"id": "df0bb3c8-6412-4a02-8c1e-df873e98101f", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-2124519556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b4b0f9568c94223acebbfc06a2c63b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c68b7663-4f0e-47f0-ac7f-40c6d952f7bb", "external-id": "nsx-vlan-transportzone-696", "segmentation_id": 696, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap89736c0d-9c", "ovs_interfaceid": "89736c0d-9c39-4076-af2a-59dd9adda150", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59369) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.569623] env[59369]: DEBUG 
oslo_concurrency.lockutils [req-930a4654-a196-4b01-937c-3222753fdda1 req-d80db556-3c2f-4721-b3ca-49240baaea88 service nova] Releasing lock "refresh_cache-f1815366-ecdb-4c46-a719-15f0ecdf5717" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 796.257207] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 796.257540] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 796.257702] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59369) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 797.258217] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 798.252695] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 798.252810] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 798.277871] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 798.278094] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Starting heal instance info cache {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 798.279057] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Rebuilding the list of instances to heal {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 798.300042] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Skipping network cache update for instance because it is Building. 
{{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 798.300316] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: d501202a-354c-42e1-8480-f026d5216a58] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 798.300545] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: ad662e06-1a0f-4110-8d23-8ff6c6889eee] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 798.300766] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 3089e10b-f9fd-4049-b8f4-9297fe6a7c86] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 798.300975] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 3117e247-0538-4a30-a0d7-aa47247a6da1] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 798.301234] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: f2673a5e-28b0-4a93-b93b-8eef64380e08] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 798.301418] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 8960dfc9-2908-4113-bebd-3a045d3e135c] Skipping network cache update for instance because it is Building. 
{{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 798.301631] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: ed81e73a-8787-4c22-842b-891d2ef116c7] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 798.301846] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 798.302072] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] [instance: f1815366-ecdb-4c46-a719-15f0ecdf5717] Skipping network cache update for instance because it is Building. {{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 798.302429] env[59369]: DEBUG nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Didn't find any instances for network info cache update. 
{{(pid=59369) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 798.303063] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 798.303438] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 798.303695] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 798.314343] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 798.314722] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 798.315017] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.315303] env[59369]: DEBUG 
nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59369) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 798.317398] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3c80203-6969-4c37-8704-f84c57336674 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.329350] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-198b99fb-b777-4a72-9470-33011246f968 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.348328] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-509a9eb9-70d9-49eb-a849-8b5371a0199e {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.355366] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-972b0f40-263a-47cb-a4a0-53a5cc125706 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.386161] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181804MB free_disk=116GB free_vcpus=48 pci_devices=None {{(pid=59369) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 798.386285] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59369) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 798.386472] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 798.453865] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance eae87f89-8488-42e6-b065-1198bfbe8177 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 798.454031] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance d501202a-354c-42e1-8480-f026d5216a58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 798.454168] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance ad662e06-1a0f-4110-8d23-8ff6c6889eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 798.454337] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 3089e10b-f9fd-4049-b8f4-9297fe6a7c86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 798.454468] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 3117e247-0538-4a30-a0d7-aa47247a6da1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 798.454587] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance f2673a5e-28b0-4a93-b93b-8eef64380e08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 798.454701] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 8960dfc9-2908-4113-bebd-3a045d3e135c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 798.454814] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance ed81e73a-8787-4c22-842b-891d2ef116c7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 798.454915] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 4062c0d2-5018-4a4f-9bc6-2fbe57a92a16 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 798.455046] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance f1815366-ecdb-4c46-a719-15f0ecdf5717 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 798.466296] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance f6b26955-b900-40d2-a910-57a4435c629c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.478238] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 598a1045-cf7a-43e5-ae0d-438b246b6f7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.488930] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 2e0fe984-3c0c-40cb-95e0-0090f7b1b36c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.499465] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance c6e58407-630f-4c11-a810-2ffec58fc397 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.509055] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 11a0c528-536f-4df5-906a-d4fa9fd581d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.517983] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance fbc72ed9-2e72-4c09-bfaa-48d56df2edc0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.527544] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance a7c91b61-dc9b-4d79-b18f-4339b2248e04 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.538548] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 35ba3bec-5278-496f-aa6c-a87861e1d8e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.547875] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 3106cf2f-1f97-4f1e-94a0-718f5fb735c8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.556524] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 4de279b1-f754-4c7b-a53c-90824b17e766 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.566179] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 83fad30b-6d18-40af-89fa-ccbc25dcc968 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.575324] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 84c38578-ad32-4847-a6f1-94b2a288f89c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.583801] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance 1ce49896-dfcf-4091-a1d2-257babf2e2cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.592225] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance f6c3dea6-e885-448d-b711-5d1e430f0664 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.601796] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance ae493999-c5ad-4714-9ff9-228f4781a2ee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.610675] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance c7b6f4c8-be99-47b6-9bb9-345ad738312a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.619571] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Instance dd191894-9cd2-4b87-9313-14dfe0636fbb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59369) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 798.619814] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59369) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 798.619958] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59369) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 798.926041] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e4f2d8a-87d3-4166-bfab-b2fa547c28da {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 798.934869] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-773b1480-6046-446f-b8ad-ed7ec0703a4a {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 798.964312] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c0253e5-3c34-4d71-8b00-06c2cc72d7cc {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 798.970961] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a2a3faf-7b7c-41d7-b14b-b3f305412973 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 798.983809] env[59369]: DEBUG nova.compute.provider_tree [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Inventory has not changed in ProviderTree for provider: 6ace6145-3535-4f74-aa29-80f64a201369 {{(pid=59369) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 798.991812] env[59369]: DEBUG nova.scheduler.client.report [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Inventory has not changed for provider 6ace6145-3535-4f74-aa29-80f64a201369 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 116, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59369) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 799.007665] env[59369]: DEBUG nova.compute.resource_tracker [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59369) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 799.007858] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.621s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 801.962681] env[59369]: DEBUG oslo_service.periodic_task [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59369) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 804.094610] env[59369]: WARNING oslo_vmware.rw_handles [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 804.094610] env[59369]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 804.094610] env[59369]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 804.094610] env[59369]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 804.094610] env[59369]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 804.094610] env[59369]: ERROR oslo_vmware.rw_handles     response.begin()
[ 804.094610] env[59369]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 804.094610] env[59369]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 804.094610] env[59369]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 804.094610] env[59369]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 804.094610] env[59369]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 804.094610] env[59369]: ERROR oslo_vmware.rw_handles
[ 804.095414] env[59369]: DEBUG nova.virt.vmwareapi.images [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Downloaded image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to vmware_temp/cb86b5bb-8328-4cd5-8bbd-98ee15dda909/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk on the data store datastore1 {{(pid=59369) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 804.096684] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Caching image {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 804.096920] env[59369]: DEBUG nova.virt.vmwareapi.vm_util [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Copying Virtual Disk [datastore1] vmware_temp/cb86b5bb-8328-4cd5-8bbd-98ee15dda909/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk to [datastore1] vmware_temp/cb86b5bb-8328-4cd5-8bbd-98ee15dda909/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk {{(pid=59369) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 804.097201] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5baea874-5764-4f0d-b74e-bebc40f3a560 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 804.105230] env[59369]: DEBUG oslo_vmware.api [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Waiting for the task: (returnval){
[ 804.105230] env[59369]: value = "task-463306"
[ 804.105230] env[59369]: _type = "Task"
[ 804.105230] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 804.112643] env[59369]: DEBUG oslo_vmware.api [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Task: {'id': task-463306, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 804.168862] env[59369]: DEBUG nova.compute.manager [req-f68e643f-e24c-4eec-8d81-e73f8ec8b7b4 req-ad9e4e4b-10cc-4be6-807e-23781b493df0 service nova] [instance: d501202a-354c-42e1-8480-f026d5216a58] Received event network-vif-deleted-92977e32-ac33-400b-a6d8-cd7726581aff {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 804.615345] env[59369]: DEBUG oslo_vmware.exceptions [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Fault InvalidArgument not matched. {{(pid=59369) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 804.615620] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Releasing lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 804.616211] env[59369]: ERROR nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 804.616211] env[59369]: Faults: ['InvalidArgument']
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Traceback (most recent call last):
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     yield resources
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     self.driver.spawn(context, instance, image_meta,
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     self._fetch_image_if_missing(context, vi)
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     image_cache(vi, tmp_image_ds_loc)
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     vm_util.copy_virtual_disk(
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     session._wait_for_task(vmdk_copy_task)
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     return self.wait_for_task(task_ref)
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     return evt.wait()
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     result = hub.switch()
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     return self.greenlet.switch()
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     self.f(*self.args, **self.kw)
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]     raise exceptions.translate_fault(task_info.error)
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Faults: ['InvalidArgument']
[ 804.616211] env[59369]: ERROR nova.compute.manager [instance: eae87f89-8488-42e6-b065-1198bfbe8177]
[ 804.617156] env[59369]: INFO nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Terminating instance
[ 804.618290] env[59369]: DEBUG oslo_concurrency.lockutils [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Acquired lock "[datastore1] devstack-image-cache_base/dc3bad91-06a4-485e-bdc4-b0135384e686/dc3bad91-06a4-485e-bdc4-b0135384e686.vmdk" {{(pid=59369) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 804.618539] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 804.619163] env[59369]: DEBUG nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Start destroying the instance on the hypervisor. {{(pid=59369) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 804.619353] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Destroying instance {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 804.619569] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-083e4850-aaae-44cb-bd2c-eae9a222143b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 804.623613] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51cb7eda-4ebb-4299-ae95-fc34ea378946 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 804.635933] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Unregistering the VM {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 804.635933] env[59369]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8e693539-6479-499a-a8c2-c15a7c1ca540 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 804.636613] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 804.636782] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59369) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 804.637457] env[59369]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d119a71c-ce68-461d-b938-af6cdcf90925 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 804.644645] env[59369]: DEBUG oslo_vmware.api [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Waiting for the task: (returnval){
[ 804.644645] env[59369]: value = "session[52794835-9974-5e05-43d4-e917bd2adbc9]52fae8d9-6d9a-101a-2b73-d933f4063fe9"
[ 804.644645] env[59369]: _type = "Task"
[ 804.644645] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 804.654019] env[59369]: DEBUG oslo_vmware.api [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Task: {'id': session[52794835-9974-5e05-43d4-e917bd2adbc9]52fae8d9-6d9a-101a-2b73-d933f4063fe9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 804.716838] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Unregistered the VM {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 804.717072] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Deleting contents of the VM from datastore datastore1 {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 804.717249] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Deleting the datastore file [datastore1] eae87f89-8488-42e6-b065-1198bfbe8177 {{(pid=59369) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 804.717496] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-528271a7-7eb0-4d2e-8def-665d8d0b1c9b {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 804.724853] env[59369]: DEBUG oslo_vmware.api [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Waiting for the task: (returnval){
[ 804.724853] env[59369]: value = "task-463308"
[ 804.724853] env[59369]: _type = "Task"
[ 804.724853] env[59369]: } to complete. {{(pid=59369) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 804.732470] env[59369]: DEBUG oslo_vmware.api [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Task: {'id': task-463308, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 804.749981] env[59369]: DEBUG nova.compute.manager [req-6de97a8a-306f-4839-819c-946753fb9bb1 req-f02ef34c-baf9-45fa-942f-ee862f7d3dd7 service nova] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Received event network-vif-deleted-6c523737-a1dc-4be3-a0b1-46b4186741ff {{(pid=59369) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 805.154688] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Preparing fetch location {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 805.154688] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Creating directory with path [datastore1] vmware_temp/4e3ef633-799a-4fab-9b8a-5bc5c5745fcf/dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 805.154962] env[59369]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0e0b0650-beae-4cab-a889-7237ce90ef19 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 805.166557] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Created directory with path [datastore1] vmware_temp/4e3ef633-799a-4fab-9b8a-5bc5c5745fcf/dc3bad91-06a4-485e-bdc4-b0135384e686 {{(pid=59369) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 805.166754] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Fetch image to [datastore1] vmware_temp/4e3ef633-799a-4fab-9b8a-5bc5c5745fcf/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk {{(pid=59369) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 805.166913] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Downloading image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to [datastore1] vmware_temp/4e3ef633-799a-4fab-9b8a-5bc5c5745fcf/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk on the data store datastore1 {{(pid=59369) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 805.167738] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9305095c-75c9-4602-99cc-491ec51e7807 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 805.174781] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83320e07-f538-42c0-97c3-ec7ef691ce17 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 805.183184] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1efc1c56-4acd-4a7f-bba8-0fe2a2fb155c {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 805.217858] env[59369]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99db02a0-bd61-4fc8-86c3-646f895b7512 {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 805.223387] env[59369]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3a2709c6-6906-4036-b358-967e224517ba {{(pid=59369) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 805.233491] env[59369]: DEBUG oslo_vmware.api [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Task: {'id': task-463308, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069627} completed successfully. {{(pid=59369) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 805.233722] env[59369]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Deleted the datastore file {{(pid=59369) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 805.233888] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Deleted contents of the VM from datastore datastore1 {{(pid=59369) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 805.234208] env[59369]: DEBUG nova.virt.vmwareapi.vmops [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Instance destroyed {{(pid=59369) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 805.234277] env[59369]: INFO nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Took 0.61 seconds to destroy the instance on the hypervisor.
[ 805.236366] env[59369]: DEBUG nova.compute.claims [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Aborting claim: {{(pid=59369) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 805.236667] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 805.236916] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 805.247944] env[59369]: DEBUG nova.virt.vmwareapi.images [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] [instance: d501202a-354c-42e1-8480-f026d5216a58] Downloading image file data dc3bad91-06a4-485e-bdc4-b0135384e686 to the data store datastore1 {{(pid=59369) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 805.264369] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 805.265153] env[59369]: DEBUG nova.compute.utils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Instance eae87f89-8488-42e6-b065-1198bfbe8177 could not be found. {{(pid=59369) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 805.266647] env[59369]: DEBUG nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Instance disappeared during build.
{{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 805.266802] env[59369]: DEBUG nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Unplugging VIFs for instance {{(pid=59369) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 805.266989] env[59369]: DEBUG nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59369) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 805.267115] env[59369]: DEBUG nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Deallocating network for instance {{(pid=59369) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 805.267268] env[59369]: DEBUG nova.network.neutron [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] deallocate_for_instance() {{(pid=59369) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 805.296652] env[59369]: DEBUG nova.network.neutron [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Updating instance_info_cache with network_info: [] {{(pid=59369) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 805.306157] env[59369]: DEBUG oslo_vmware.rw_handles [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e3ef633-799a-4fab-9b8a-5bc5c5745fcf/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59369) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 805.360761] env[59369]: INFO nova.compute.manager [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] Took 0.09 seconds to deallocate network for instance. [ 805.365642] env[59369]: DEBUG oslo_vmware.rw_handles [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Completed reading data from the image iterator. {{(pid=59369) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 805.365846] env[59369]: DEBUG oslo_vmware.rw_handles [None req-d0b49fb8-e92b-4b35-a5c2-a3aed9b6a081 tempest-ServerDiagnosticsNegativeTest-1630934522 tempest-ServerDiagnosticsNegativeTest-1630934522-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e3ef633-799a-4fab-9b8a-5bc5c5745fcf/dc3bad91-06a4-485e-bdc4-b0135384e686/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59369) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 805.366552] env[59369]: DEBUG oslo_concurrency.lockutils [None req-5ac3ff81-d061-46b9-97b1-ef6ac74b199e tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Acquiring lock "0d3861c9-e71e-4dd7-af12-fe3908589500" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 805.367041] env[59369]: DEBUG oslo_concurrency.lockutils [None req-5ac3ff81-d061-46b9-97b1-ef6ac74b199e tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Lock "0d3861c9-e71e-4dd7-af12-fe3908589500" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 805.406756] env[59369]: DEBUG oslo_concurrency.lockutils [None req-1a94da33-2a90-4741-8058-501a8c4b2945 tempest-DeleteServersAdminTestJSON-890964930 tempest-DeleteServersAdminTestJSON-890964930-project-member] Lock "eae87f89-8488-42e6-b065-1198bfbe8177" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.491s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 805.408320] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "eae87f89-8488-42e6-b065-1198bfbe8177" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 194.934s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 805.410014] env[59369]: INFO nova.compute.manager [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None 
None] [instance: eae87f89-8488-42e6-b065-1198bfbe8177] During sync_power_state the instance has a pending task (spawning). Skip. [ 805.410014] env[59369]: DEBUG oslo_concurrency.lockutils [None req-ee46ec64-eaeb-4d64-8834-02cb1973eac8 None None] Lock "eae87f89-8488-42e6-b065-1198bfbe8177" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59369) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 805.417775] env[59369]: DEBUG nova.compute.manager [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] [instance: f6b26955-b900-40d2-a910-57a4435c629c] Starting instance... {{(pid=59369) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 805.473968] env[59369]: DEBUG oslo_concurrency.lockutils [None req-edb64f36-6ae4-4beb-b34c-bb6b27abdd85 tempest-MultipleCreateTestJSON-1252325267 tempest-MultipleCreateTestJSON-1252325267-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker