Apr 21 13:53:44 user nova-compute[71474]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
Apr 21 13:53:47 user nova-compute[71474]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=71474) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 21 13:53:47 user nova-compute[71474]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=71474) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 21 13:53:47 user nova-compute[71474]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=71474) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 21 13:53:47 user nova-compute[71474]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Apr 21 13:53:47 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 21 13:53:47 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.019s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 21 13:53:47 user nova-compute[71474]: INFO nova.virt.driver [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] Loading compute driver 'libvirt.LibvirtDriver'
Apr 21 13:53:48 user nova-compute[71474]: INFO nova.compute.provider_config [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] Acquiring lock "singleton_lock" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] Acquired lock "singleton_lock" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] Releasing lock "singleton_lock" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] Full set of CONF: {{(pid=71474) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ******************************************************************************** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] Configuration options gathered from: {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] command line args: ['--config-file', '/etc/nova/nova-cpu.conf'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] config files: ['/etc/nova/nova-cpu.conf'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ================================================================================ {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] allow_resize_to_same_host = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] arq_binding_timeout = 300 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] backdoor_port = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] backdoor_socket = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] block_device_allocate_retries = 300 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] block_device_allocate_retries_interval = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cert = self.pem {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute_driver = libvirt.LibvirtDriver {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute_monitors = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] config_dir = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] config_drive_format = iso9660 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] config_file = ['/etc/nova/nova-cpu.conf'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] config_source = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] console_host = user {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] control_exchange = nova {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cpu_allocation_ratio = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] daemon = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] debug = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] default_access_ip_network_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] default_availability_zone = nova {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] default_ephemeral_format = ext4 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] default_schedule_zone = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] disk_allocation_ratio = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] enable_new_services = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] enabled_apis = ['osapi_compute'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] enabled_ssl_apis = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] flat_injected = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] force_config_drive = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] force_raw_images = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] graceful_shutdown_timeout = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] heal_instance_info_cache_interval = 60 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] host = user {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] initial_disk_allocation_ratio = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] initial_ram_allocation_ratio = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] instance_build_timeout = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] instance_delete_interval = 300 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] instance_format = [instance: %(uuid)s] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] instance_name_template = instance-%08x {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] instance_usage_audit = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] instance_usage_audit_period = month {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] instances_path = /opt/stack/data/nova/instances {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] internal_service_availability_zone = internal {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] key = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] live_migration_retry_count = 30 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] log_config_append = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] log_dir = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] log_file = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] log_options = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] log_rotate_interval = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] log_rotate_interval_type = days {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] log_rotation_type = none {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] long_rpc_timeout = 1800 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] max_concurrent_builds = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] max_concurrent_live_migrations = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] max_concurrent_snapshots = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] max_local_block_devices = 3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] max_logfile_count = 30 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] max_logfile_size_mb = 200 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] maximum_instance_delete_attempts = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] metadata_listen = 0.0.0.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] metadata_listen_port = 8775 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] metadata_workers = 3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] migrate_max_retries = -1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] mkisofs_cmd = genisoimage {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] my_block_storage_ip = 10.0.0.210 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] my_ip = 10.0.0.210 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] network_allocate_retries = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] osapi_compute_listen = 0.0.0.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] osapi_compute_listen_port = 8774 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] osapi_compute_unique_server_name_scope = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] osapi_compute_workers = 3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] password_length = 12 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] periodic_enable = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] periodic_fuzzy_delay = 60 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] pointer_model = ps2mouse {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] preallocate_images = none {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] publish_errors = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] pybasedir = /opt/stack/nova {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ram_allocation_ratio = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] rate_limit_burst = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] rate_limit_except_level = CRITICAL {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] rate_limit_interval = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] reboot_timeout = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] reclaim_instance_interval = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] record = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] reimage_timeout_per_gb = 20 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] report_interval = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] rescue_timeout = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] reserved_host_cpus = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] reserved_host_disk_mb = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] reserved_host_memory_mb = 512 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] reserved_huge_pages = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] resize_confirm_window = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] resize_fs_using_block_device = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] resume_guests_state_on_host_boot = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] rpc_response_timeout = 60 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] run_external_periodic_tasks = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] running_deleted_instance_action = reap {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] running_deleted_instance_poll_interval = 1800 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] running_deleted_instance_timeout = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler_instance_sync_interval = 120 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] service_down_time = 60 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] servicegroup_driver = db {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] shelved_offload_time = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] shelved_poll_interval = 3600 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] shutdown_timeout = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] source_is_ipv6 = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ssl_only = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] state_path = /opt/stack/data/nova {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] sync_power_state_interval = 600 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] sync_power_state_pool_size = 1000 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] syslog_log_facility = LOG_USER {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] tempdir = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] timeout_nbd = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] transport_url = **** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] update_resources_interval = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] use_cow_images = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] use_eventlog = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] use_journal = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] use_json = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] use_rootwrap_daemon = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] use_stderr = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] use_syslog = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vcpu_pin_set = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plugging_is_fatal = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plugging_timeout = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] virt_mkfs = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] volume_usage_poll_interval = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] watch_log_file = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] web = /usr/share/spice-html5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_concurrency.disable_process_locking = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.auth_strategy = keystone {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.compute_link_prefix = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.dhcp_domain = novalocal {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.enable_instance_password = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.glance_link_prefix = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.instance_list_per_project_cells = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.list_records_by_skipping_down_cells = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.local_metadata_per_cell = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.max_limit = 1000 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.metadata_cache_expiration = 15 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.neutron_default_tenant_id = default {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.use_forwarded_for = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.use_neutron_default_nets = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.vendordata_dynamic_targets = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.vendordata_jsonfile_path = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.backend = dogpile.cache.memcached {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.backend_argument = **** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.config_prefix = cache.oslo {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.dead_timeout = 60.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.debug_cache_backend = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.enable_retry_client = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.enable_socket_keepalive = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.enabled = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.expiration_time = 600 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.hashclient_retry_attempts = 2 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.hashclient_retry_delay = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.memcache_dead_retry = 300 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.memcache_password = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.memcache_pool_maxsize = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.memcache_sasl_enabled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.memcache_socket_timeout = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.memcache_username = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.proxies = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.retry_attempts = 2 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.retry_delay = 0.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.socket_keepalive_count = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.socket_keepalive_idle = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.socket_keepalive_interval = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.tls_allowed_ciphers = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.tls_cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.tls_certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.tls_enabled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cache.tls_keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.auth_section = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.auth_type = password {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.catalog_info = volumev3::publicURL {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.cross_az_attach = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.debug = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.endpoint_template = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.http_retries = 3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.os_region_name = RegionOne {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cinder.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.cpu_dedicated_set = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.cpu_shared_set = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.image_type_exclude_list = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.max_concurrent_disk_ops = 
0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.max_disk_devices_to_attach = -1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.resource_provider_association_refresh = 300 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.shutdown_retry_interval = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] conductor.workers = 3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] console.allowed_origins = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] console.ssl_ciphers = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] console.ssl_minimum_version = default {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] consoleauth.token_ttl = 600 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG 
oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.connect_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.connect_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.endpoint_override = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.max_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.min_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.region_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.service_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.service_type = accelerator {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.status_code_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.status_code_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None 
req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] cyborg.version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.backend = sqlalchemy {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.connection = **** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.connection_debug = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.connection_parameters = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.connection_recycle_time = 3600 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.connection_trace = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.db_inc_retry_interval = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.db_max_retries = 20 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.db_max_retry_interval = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.db_retry_interval = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.max_overflow = 50 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service 
[None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.max_pool_size = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.max_retries = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.mysql_enable_ndb = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.mysql_wsrep_sync_wait = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.pool_timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.retry_interval = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.slave_connection = **** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] database.sqlite_synchronous = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.backend = sqlalchemy {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.connection = **** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.connection_debug = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.connection_parameters = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.connection_recycle_time = 3600 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: 
DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.connection_trace = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.db_inc_retry_interval = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.db_max_retries = 20 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.db_max_retry_interval = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.db_retry_interval = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.max_overflow = 50 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.max_pool_size = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.max_retries = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.mysql_enable_ndb = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.pool_timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.retry_interval = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.slave_connection = **** {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] api_database.sqlite_synchronous = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] devices.enabled_mdev_types = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ephemeral_storage_encryption.enabled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.api_servers = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.connect_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.connect_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.debug = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.default_trusted_certificate_ids = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] 
glance.enable_certificate_validation = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.enable_rbd_download = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.endpoint_override = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.max_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.min_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.num_retries = 3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.rbd_ceph_conf = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.rbd_connect_timeout = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.rbd_pool = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.rbd_user = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.region_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.service_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.service_type = image {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.status_code_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.status_code_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.verify_glance_signatures = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] glance.version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] guestfs.debug = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.config_drive_cdrom = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.config_drive_inject_password = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.enable_instance_metrics_collection = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.enable_remotefx = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] 
hyperv.instances_path_share = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.iscsi_initiator_list = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.limit_cpu_features = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.power_state_check_timeframe = 60 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.use_multipath_io = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.volume_attach_retry_count = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.vswitch_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] mks.enabled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service 
[None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] image_cache.manager_interval = 2400 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] image_cache.precache_concurrency = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] image_cache.remove_unused_base_images = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] image_cache.subdirectory_name = _base {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.api_max_retries = 60 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.api_retry_interval = 2 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.auth_section = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.auth_type = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 
21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.connect_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.connect_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.endpoint_override = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.max_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.min_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.partition_key = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.peer_list = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.region_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.serial_console_state_timeout = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.service_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.service_type = baremetal {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG 
oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.status_code_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.status_code_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ironic.version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] key_manager.fixed_key = **** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.barbican_api_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.barbican_endpoint = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.barbican_endpoint_type = public {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.barbican_region_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.certfile = None {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.number_of_retries = 60 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.retry_delay = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.send_service_user_token = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.verify_ssl = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican.verify_ssl_path = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican_service_user.auth_section = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican_service_user.auth_type = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican_service_user.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican_service_user.certfile = None {{(pid=71474) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican_service_user.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican_service_user.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican_service_user.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican_service_user.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] barbican_service_user.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.approle_role_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.approle_secret_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.kv_mountpoint = secret {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.kv_version = 2 {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.namespace = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.root_token_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.ssl_ca_crt_file = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.use_ssl = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.connect_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.connect_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.endpoint_override = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.insecure = False {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.max_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.min_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.region_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.service_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.service_type = identity {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.status_code_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.status_code_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] keystone.version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.connection_uri = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.cpu_mode = custom {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.cpu_model_extra_flags = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: WARNING oslo_config.cfg [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] Deprecated: Option "cpu_model" from group "libvirt" is deprecated. Use option "cpu_models" from group "libvirt". Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.cpu_models = ['Nehalem'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.cpu_power_governor_high = performance {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.cpu_power_governor_low = powersave {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.cpu_power_management = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.device_detach_attempts = 8 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.device_detach_timeout = 20 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.disk_cachemodes = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.disk_prefix = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.enabled_perf_events = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.file_backed_memory = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.gid_maps = [] 
{{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.hw_disk_discard = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.hw_machine_type = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.images_rbd_ceph_conf = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.images_rbd_glance_store_name = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.images_rbd_pool = rbd {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.images_type = default {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.images_volume_group = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.inject_key = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.inject_partition = -2 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.inject_password = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.iscsi_iface = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] 
libvirt.iser_use_multipath = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_bandwidth = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_downtime = 500 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_inbound_addr = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_permit_post_copy = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_scheme = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_timeout_action = abort {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_tunnelled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: WARNING oslo_config.cfg [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Apr 21 13:53:48 user nova-compute[71474]: live_migration_uri is deprecated for removal in favor of two other options that Apr 21 13:53:48 user nova-compute[71474]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Apr 21 13:53:48 user nova-compute[71474]: and 
``live_migration_inbound_addr`` respectively. Apr 21 13:53:48 user nova-compute[71474]: ). Its value may be silently ignored in the future. Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_uri = qemu+ssh://stack@%s/system {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.live_migration_with_native_tls = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.max_queues = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.nfs_mount_options = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.num_iser_scan_tries = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.num_memory_encrypted_guests = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.num_pcie_ports = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.num_volume_scan_tries = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.pmem_namespaces = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user 
nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.quobyte_client_cfg = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.rbd_connect_timeout = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.rbd_secret_uuid = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.rbd_user = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.remote_filesystem_transport = ssh {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.rescue_image_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.rescue_kernel_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.rescue_ramdisk_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.rx_queue_size = None {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.smbfs_mount_options = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.snapshot_compression = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.snapshot_image_format = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.sparse_logical_volumes = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.swtpm_enabled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.swtpm_group = tss {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.swtpm_user = tss {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.sysinfo_serial = unique {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.tx_queue_size = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.uid_maps = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.use_virtio_for_bridges = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] 
libvirt.virt_type = kvm {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.volume_clear = zero {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.volume_clear_size = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.volume_use_multipath = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.vzstorage_cache_path = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.vzstorage_mount_group = qemu {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.vzstorage_mount_opts = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.vzstorage_mount_user = stack {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.auth_section = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.auth_type = password {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user 
nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.connect_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.connect_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.default_floating_pool = public {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.endpoint_override = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.extension_sync_interval = 600 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.http_retries = 3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.max_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.min_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: 
DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.ovs_bridge = br-int {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.physnets = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.region_name = RegionOne {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.service_metadata_proxy = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.service_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.service_type = network {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.status_code_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.status_code_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] neutron.version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] notifications.bdms_in_notifications = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] notifications.default_level = INFO {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user 
nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] notifications.notification_format = unversioned {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] notifications.notify_on_state_change = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] pci.alias = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] pci.device_spec = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] pci.report_in_placement = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.auth_section = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.auth_type = password {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.auth_url = http://10.0.0.210/identity {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.connect_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.connect_retry_delay = None {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.default_domain_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.default_domain_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.domain_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.domain_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.endpoint_override = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.max_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.min_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.password = **** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.project_domain_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.project_domain_name = Default {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.project_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.project_name = service {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.region_name = RegionOne {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.service_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.service_type = placement {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.status_code_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.status_code_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.system_scope = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.trust_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.user_domain_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.user_domain_name = Default {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.user_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.username = placement {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.valid_interfaces = ['internal', 'public'] 
{{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] placement.version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.cores = 20 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.count_usage_from_placement = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.injected_file_content_bytes = 10240 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.injected_file_path_length = 255 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.injected_files = 5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.instances = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.key_pairs = 100 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.metadata_items = 128 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.ram = 51200 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.recheck_quota = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.server_group_members = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] quota.server_groups = 10 {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] rdp.enabled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler.image_metadata_prefilter = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler.max_attempts = 3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler.max_placement_results = 1000 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler.query_placement_for_availability_zone = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler.query_placement_for_image_type_support = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] scheduler.workers = 3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 
13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.host_subset_size = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.isolated_hosts = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: 
DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.isolated_images = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.pci_in_placement = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.track_instance_changes = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] metrics.required = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service 
[None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] metrics.weight_multiplier = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] metrics.weight_setting = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] serial_console.enabled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] serial_console.port_range = 10000:20000 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] serial_console.serialproxy_port = 6083 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] service_user.auth_section = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] service_user.auth_type = password {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] service_user.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] service_user.certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] service_user.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 
13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] service_user.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] service_user.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] service_user.send_service_user_token = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] service_user.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] service_user.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.agent_enabled = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.enabled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.html5proxy_base_url = http://10.0.0.210:6081/spice_auto.html {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.html5proxy_port = 6082 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.image_compression = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.jpeg_compression = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.playback_compression = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.server_listen = 127.0.0.1 {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.streaming_mode = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] spice.zlib_compression = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] upgrade_levels.baseapi = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] upgrade_levels.cert = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] upgrade_levels.compute = auto {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] upgrade_levels.conductor = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] upgrade_levels.scheduler = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vendordata_dynamic_auth.auth_section = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vendordata_dynamic_auth.auth_type = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vendordata_dynamic_auth.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vendordata_dynamic_auth.certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] 
vendordata_dynamic_auth.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vendordata_dynamic_auth.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vendordata_dynamic_auth.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.api_retry_count = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.ca_file = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.cache_prefix = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.cluster_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.connection_pool_size = 10 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.console_delay_seconds = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.datastore_regex = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.host_ip = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.host_password = **** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.host_port = 443 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] 
vmware.host_username = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.integration_bridge = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.maximum_objects = 100 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.pbm_default_policy = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.pbm_enabled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.pbm_wsdl_location = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.serial_port_proxy_uri = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.serial_port_service_uri = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.task_poll_interval = 0.5 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.use_linked_clone = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.vnc_keymap = en-us {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vmware.vnc_port = 5900 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] 
vmware.vnc_port_total = 10000 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vnc.auth_schemes = ['none'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vnc.enabled = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vnc.novncproxy_base_url = http://10.0.0.210:6080/vnc_lite.html {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vnc.novncproxy_port = 6080 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vnc.server_listen = 0.0.0.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vnc.server_proxyclient_address = 10.0.0.210 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vnc.vencrypt_ca_certs = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vnc.vencrypt_client_cert = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vnc.vencrypt_client_key = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.disable_group_policy_check_upcall = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG 
oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.disable_rootwrap = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.enable_numa_live_migration = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.libvirt_disable_apic = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None 
req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] wsgi.client_socket_timeout = 900 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] wsgi.default_pool_size = 1000 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] wsgi.keep_alive = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] wsgi.max_header_line = 16384 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] wsgi.secure_proxy_ssl_header = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] wsgi.ssl_ca_file = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] wsgi.ssl_cert_file = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] wsgi.ssl_key_file = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] wsgi.tcp_keepidle = 600 {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] zvm.ca_file = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] zvm.cloud_connector_url = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] zvm.reachable_timeout = 300 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_policy.enforce_new_defaults = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_policy.enforce_scope = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_policy.policy_default_rule = default {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_policy.policy_file = policy.yaml {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
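Every DEBUG entry in this dump is emitted by oslo.config's log_opt_values() helper, the function named in each line. Below is a minimal, self-contained sketch of that mechanism; the two options registered here (vnc.enabled and serial_console.enabled, values as they appear in this dump) are stand-ins for illustration and are not nova's actual option registration code.

```python
# Sketch of the oslo.config option dump seen in the log above.
# The registered options are illustrative stand-ins, not nova's real set.
import logging

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.CONF
CONF.register_opts([cfg.BoolOpt('enabled', default=True)], group='vnc')
CONF.register_opts([cfg.BoolOpt('enabled', default=False)],
                   group='serial_console')

# Parse arguments (none needed for this sketch), then dump every registered
# option and its effective value at DEBUG level, one "group.option = value"
# line per option, just as in the log excerpt above.
CONF([])
CONF.log_opt_values(LOG, logging.DEBUG)
```

Note that log_opt_values() prints the effective value of each option; the dump does not distinguish values set in the configuration file from built-in defaults.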
Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] profiler.connection_string = messaging:// {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] profiler.enabled = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] profiler.es_doc_type = notification {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] profiler.es_scroll_size = 10000 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] profiler.es_scroll_time = 2m {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] profiler.filter_error_trace = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] profiler.hmac_keys = SECRET_KEY {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] profiler.sentinel_service_name = mymaster {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] profiler.socket_timeout = 0.1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] 
profiler.trace_sqlalchemy = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] remote_debug.host = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] remote_debug.port = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=71474) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_bytes = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_length = 0 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 
{{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.ssl = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_rabbit.ssl_version = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_notifications.retry = -1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_messaging_notifications.transport_url = **** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.auth_section = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.auth_type = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user 
nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.cafile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.certfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.collect_timing = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.connect_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.connect_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.endpoint_id = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.endpoint_override = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.insecure = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.keyfile = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.max_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.min_version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.region_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.service_name = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.service_type = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user 
nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.split_loggers = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.status_code_retries = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.status_code_retry_delay = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.timeout = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.valid_interfaces = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_limit.version = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_reports.file_event_handler = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] oslo_reports.log_dir = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] 
vif_plug_linux_bridge_privileged.thread_pool_size = 12 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plug_ovs_privileged.group = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plug_ovs_privileged.thread_pool_size = 12 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] vif_plug_ovs_privileged.user = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_linux_bridge.flat_interface = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=71474) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_ovs.isolate_vif = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_ovs.ovsdb_interface = native {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_vif_ovs.per_port_bridge = False {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] os_brick.lock_path = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] privsep_osbrick.capabilities = [21] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] privsep_osbrick.group = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] privsep_osbrick.helper_command = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None 
req-2498d96d-de77-4124-a0f4-1034576ca387 None None] privsep_osbrick.thread_pool_size = 12 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] privsep_osbrick.user = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] nova_sys_admin.group = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] nova_sys_admin.helper_command = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] nova_sys_admin.thread_pool_size = 12 {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] nova_sys_admin.user = None {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG oslo_service.service [None req-2498d96d-de77-4124-a0f4-1034576ca387 None None] ******************************************************************************** {{(pid=71474) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} Apr 21 13:53:48 user nova-compute[71474]: INFO nova.service [-] Starting compute node (version 0.0.0) Apr 21 13:53:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Starting native event thread {{(pid=71474) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:492}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Starting green dispatch thread {{(pid=71474) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:498}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Starting connection event dispatch thread {{(pid=71474) initialize /opt/stack/nova/nova/virt/libvirt/host.py:620}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Connecting to libvirt: qemu:///system {{(pid=71474) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:503}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Registering for lifecycle events {{(pid=71474) 
_get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:509}} Apr 21 13:53:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Registering for connection events: {{(pid=71474) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:530}} Apr 21 13:53:48 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Connection event '1' reason 'None' Apr 21 13:53:48 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Cannot update service status on host "user" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 21 13:53:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.volume.mount [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Initialising _HostMountState generation 0 {{(pid=71474) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}} Apr 21 13:53:55 user nova-compute[71474]: INFO nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Libvirt host capabilities
[libvirt host capabilities XML, tags stripped in this capture; recoverable values: host UUID e20c3142-5af9-7467-ecd8-70b2e4a210d6, arch x86_64, CPU model IvyBridge-IBRS (Intel), migration URI transports tcp and rdma, single NUMA cell memory/page figures (8152920/2038230/0 and 8255100/2063775/0), security models apparmor and dac (baselabel +64055:+108), followed by hvm guest entries for /usr/bin/qemu-system-{alpha, arm, aarch64, cris, i386, m68k, microblaze, microblazeel, mips, mipsel, mips64, mips64el, ppc, ppc64, ppc64le, riscv32, riscv64, s390x, sh4, sh4eb, sparc, sparc64, x86_64, xtensa, xtensaeb} and their supported machine types (virt-*, pc-i440fx-*, pc-q35-*, pseries-*, s390-ccw-virtio-*, and so on)]
nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for armv6l via machine types: {'virt', None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:55 user nova-compute[71474]: DEBUG 
nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for i686 via machine types: {'q35', 'ubuntu-q35', 'ubuntu', 'pc'} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Apr 21 13:53:55 user nova-compute[71474]: [domain capabilities XML, markup lost in capture — emulator /usr/bin/qemu-system-i386, domain type kvm, machine pc-q35-6.2, arch i686; loaders /usr/share/OVMF/OVMF_CODE.fd, /usr/share/OVMF/OVMF_CODE.secboot.fd, /usr/share/AAVMF/AAVMF_CODE.fd, /usr/share/AAVMF/AAVMF32_CODE.fd, /usr/share/OVMF/OVMF_CODE.ms.fd (types rom, pflash; readonly yes/no; secure no); host-model CPU IvyBridge-IBRS, vendor Intel; custom CPU models 486, athlon, core2duo, coreduo, kvm32, kvm64, n270, pentium, pentium2, pentium3, phenom, qemu32, qemu64, Conroe, Penryn, Nehalem(-IBRS), Westmere(-IBRS), SandyBridge(-IBRS), IvyBridge(-IBRS), Haswell variants, Broadwell variants, Skylake-Client/Server variants, Cascadelake-Server(-noTSX), Cooperlake, Icelake-Client/Server(-noTSX), Snowridge, Opteron_G1 through Opteron_G5, EPYC, EPYC-IBPB, EPYC-Rome, EPYC-Milan, Dhyana; memory backing file, anonymous, memfd; disk devices disk, cdrom, floppy, lun on buses fdc, scsi, virtio, usb, sata with models virtio, virtio-transitional, virtio-non-transitional; graphics sdl, vnc, spice, egl-headless; hostdev mode subsystem with startup policies default, mandatory, requisite, optional and subsystem types usb, pci, scsi; rng models virtio, virtio-transitional, virtio-non-transitional with backends random, egd, builtin; filesystem drivers path, handle, virtiofs; TPM models tpm-tis, tpm-crb with backends passthrough, emulator] {{(pid=71474) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35:
Apr 21 13:53:55 user nova-compute[71474]: [domain capabilities XML — same capability set as for machine_type=q35, except machine pc-q35-jammy] {{(pid=71474) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu:
Apr 21 13:53:55 user nova-compute[71474]: [domain capabilities XML — same capability set, except machine pc-i440fx-jammy and an additional ide disk bus] {{(pid=71474) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 13:53:55 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Apr 21 13:53:56 user nova-compute[71474]: [domain capabilities XML — same capability set, except machine pc-i440fx-6.2, with ide disk bus] {{(pid=71474) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for m68k via machine types: {'virt', None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from
libvirt when retrieving domain capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for ppc64 via machine types: {'powernv', None, 'pseries'} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving 
domain capabilities for arch ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for ppc64le via machine types: {'powernv', 'pseries'} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv32' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported 
by '/usr/bin/qemu-system-riscv64' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-s390x' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch sh4 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch sparc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by 
'/usr/bin/qemu-system-sparc64' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for x86_64 via machine types: {'q35', 'ubuntu-q35', 'ubuntu', 'pc'} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: [domainCapabilities XML followed here in the original log; its markup was lost in this capture. Recoverable values: path /usr/bin/qemu-system-x86_64, domain kvm, machine pc-q35-6.2, arch x86_64; firmware efi with loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd and /usr/share/OVMF/OVMF_CODE_4M.fd, loader types rom/pflash, yes/no flags for readonly and secure; cpu host-model IvyBridge-IBRS (vendor Intel), host-passthrough with on/off toggles, and a custom-model list spanning qemu64/qemu32, legacy models (486, pentium/pentium2/pentium3, phenom, n270, kvm64/kvm32, coreduo/core2duo, athlon, Conroe, Penryn), Nehalem, Westmere, SandyBridge, IvyBridge, Haswell, Broadwell, Skylake-Client/Server, Cascadelake-Server, Cooperlake, Icelake-Client/Server, Snowridge, Opteron_G1-G5, EPYC/EPYC-IBPB/EPYC-Rome/EPYC-Milan and Dhyana (plus -IBRS/-noTSX variants); memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata with models virtio/virtio-transitional/virtio-non-transitional; graphics sdl/vnc/spice/egl-headless; hostdev mode subsystem, start policies default/mandatory/requisite/optional, types usb/pci/scsi, models virtio/virtio-transitional/virtio-non-transitional; rng backends random/egd/builtin; filesystem drivers path/handle/virtiofs; tpm models tpm-tis/tpm-crb with backends passthrough/emulator] {{(pid=71474) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35: [stripped domainCapabilities XML; same recoverable content as the q35 entry except machine pc-q35-jammy] {{(pid=71474) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu: [stripped domainCapabilities XML; differs from the q35 entry in that machine is pc-i440fx-jammy, the only firmware loader listed is /usr/share/OVMF/OVMF_CODE_4M.fd, the secure flag only lists no, and ide is added to the disk buses] {{(pid=71474) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: [stripped domainCapabilities XML; same recoverable content as the ubuntu entry except machine pc-i440fx-6.2] {{(pid=71474) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=71474) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=71474) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Checking secure boot support for host arch (x86_64) {{(pid=71474) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 21 13:53:56 user nova-compute[71474]: INFO nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Secure Boot support detected
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] cpu compare xml: [cpu XML; markup lost in this capture, model Nehalem] {{(pid=71474) _compare_cpu /opt/stack/nova/nova/virt/libvirt/driver.py:9996}}
Apr 21 13:53:56 user nova-compute[71474]: INFO nova.virt.node [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Generated node identity 4e62c1ab-67bb-43ed-8389-61deb50e98d7
Apr 21 13:53:56 user nova-compute[71474]: INFO nova.virt.node [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Wrote node identity 4e62c1ab-67bb-43ed-8389-61deb50e98d7 to /opt/stack/data/nova/compute_id
Apr 21 13:53:56 user nova-compute[71474]: WARNING nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Compute nodes ['4e62c1ab-67bb-43ed-8389-61deb50e98d7'] for host user were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
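The capability dumps summarised above are the output of libvirt's domainCapabilities API, which nova queries once per emulator/arch/machine/virt_type combination at startup. A minimal sketch of the same query made outside nova, assuming the libvirt-python bindings are installed and qemu:///system is reachable; the emulator, arch, machine and virt_type values simply mirror the q35 entry in this log:

```python
# Sketch only: fetch a domainCapabilities document like the ones logged above.
# Assumes libvirt-python and a local qemu:///system socket are available.
import libvirt
import xml.etree.ElementTree as ET

conn = libvirt.open("qemu:///system")
caps_xml = conn.getDomainCapabilities(
    "/usr/bin/qemu-system-x86_64",  # emulator binary
    "x86_64",                       # arch
    "q35",                          # machine type
    "kvm",                          # virt type
    0,                              # flags
)
root = ET.fromstring(caps_xml)
# Canonical machine name, e.g. pc-q35-6.2 in this log.
print(root.findtext("machine"))
# host-model CPU reported by libvirt, e.g. IvyBridge-IBRS here.
print(root.findtext("./cpu/mode[@name='host-model']/model"))
conn.close()
```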
Apr 21 13:53:56 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host Apr 21 13:53:56 user nova-compute[71474]: WARNING nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] No compute node record found for host user. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 21 13:53:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 13:53:56 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:53:56 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Hypervisor/Node resource view: name=user free_ram=10845MB free_disk=26.62847900390625GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:53:56 user nova-compute[71474]: WARNING nova.compute.resource_tracker [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] No compute node record for user:4e62c1ab-67bb-43ed-8389-61deb50e98d7: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 4e62c1ab-67bb-43ed-8389-61deb50e98d7 could not be found. Apr 21 13:53:56 user nova-compute[71474]: INFO nova.compute.resource_tracker [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Compute node record created for user:user with uuid: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 13:53:56 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [req-4e5b681a-ea87-4ae6-ba0a-417492561f36] Created resource provider record via placement API for resource provider with UUID 4e62c1ab-67bb-43ed-8389-61deb50e98d7 and name user. 
Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] /sys/module/kvm_amd/parameters/sev does not exist {{(pid=71474) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:1766}} Apr 21 13:53:56 user nova-compute[71474]: INFO nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] kernel doesn't support AMD SEV Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Updating inventory in ProviderTree for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 with inventory: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Libvirt baseline CPU Apr 21 13:53:56 user nova-compute[71474]: x86_64 Apr 21 13:53:56 user nova-compute[71474]: Nehalem Apr 21 13:53:56 user nova-compute[71474]: Intel Apr 21 13:53:56 user nova-compute[71474]: Apr 21 13:53:56 user nova-compute[71474]: Apr 21 13:53:56 user nova-compute[71474]: {{(pid=71474) _get_guest_baseline_cpu_features /opt/stack/nova/nova/virt/libvirt/driver.py:12486}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Updated inventory for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Updating resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 generation from 0 to 1 during operation: update_inventory {{(pid=71474) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 21 13:53:56 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Updating inventory in ProviderTree for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 with inventory: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 13:53:57 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Updating resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 generation from 1 to 2 during operation: update_traits {{(pid=71474) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 21 13:53:57 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 13:53:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:53:57 user nova-compute[71474]: DEBUG nova.service [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Creating RPC server for service compute {{(pid=71474) start /opt/stack/nova/nova/service.py:182}} Apr 21 13:53:57 user nova-compute[71474]: DEBUG nova.service [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Join ServiceGroup membership for this service compute {{(pid=71474) start /opt/stack/nova/nova/service.py:199}} Apr 21 13:53:57 user nova-compute[71474]: DEBUG nova.servicegroup.drivers.db [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] DB_Driver: join new ServiceGroup member user to the compute group, service = {{(pid=71474) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Rebuilding the list of instances to heal {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Didn't find any instances for network info cache update. 
{{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 13:54:48 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:54:48 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
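The inventory the resource tracker pushes to Placement in the entries above carries total, reserved and allocation_ratio per resource class, and Placement treats the schedulable amount as (total - reserved) * allocation_ratio. A minimal sketch, plain Python with the values copied from the logged inventory (the helper name effective_capacity is ours, not Nova's):

```python
# Illustrative only: recompute schedulable capacity from the inventory logged above.
# Placement treats capacity as (total - reserved) * allocation_ratio per resource class.
inventory = {
    "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
    "VCPU": {"total": 12, "reserved": 0, "allocation_ratio": 4.0},
    "DISK_GB": {"total": 40, "reserved": 0, "allocation_ratio": 1.0},
}

def effective_capacity(inv: dict) -> dict:
    """Return the schedulable amount per resource class."""
    return {
        rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
        for rc, v in inv.items()
    }

print(effective_capacity(inventory))
# {'MEMORY_MB': 15511.0, 'VCPU': 48.0, 'DISK_GB': 40.0}
```

With the logged ratios this works out to 15511 MB of RAM, 48 VCPUs and 40 GB of disk available to the scheduler, which is why 12 physical vCPUs can back more than 12 scheduled ones.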
Apr 21 13:54:48 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=10281MB free_disk=26.542369842529297GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.313s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:54:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:55:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:55:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:55:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:55:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 13:55:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Rebuilding the list of instances to heal {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 13:55:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Didn't find any instances for network info cache update. 
{{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 13:55:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:55:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:55:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:55:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:55:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:55:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:55:49 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 13:55:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:55:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:55:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:55:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:55:49 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 13:55:50 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:55:50 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 13:55:50 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=10227MB free_disk=26.587356567382812GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 13:55:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:55:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:55:50 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 13:55:50 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 13:55:50 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:55:50 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:55:50 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 13:55:50 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:56:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:56:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 13:56:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Rebuilding the list of instances to heal {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 13:56:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Didn't find any instances for network info cache update. {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 13:56:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:56:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:56:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:56:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:56:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:56:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:56:49 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 13:56:50 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:56:50 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:56:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:56:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:56:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:56:50 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 13:56:51 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:56:51 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
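The paired 'Acquiring lock "compute_resources"' / 'Lock "compute_resources" "released"' lines above are emitted from oslo.concurrency's synchronized decorator wrapping the resource-tracker methods (note the inner lockutils.py source locations), not from Nova code itself. A minimal sketch of the pattern, assuming oslo.concurrency is installed; the function name is illustrative and Nova's real decorator additionally passes fairness options omitted here:

```python
from oslo_concurrency import lockutils

# Illustrative only: the decorator pattern behind the acquire/release messages above.
@lockutils.synchronized('compute_resources')
def _update_available_resource_example():
    # Runs with the named lock held; lockutils itself logs the
    # 'acquired ... waited' and 'released ... held' DEBUG lines.
    pass

_update_available_resource_example()
```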
Apr 21 13:56:51 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=10219MB free_disk=26.36661148071289GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 13:56:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:56:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:56:51 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 13:56:51 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 13:56:51 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:56:51 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:56:51 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 13:56:51 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:57:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:57:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:57:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:57:49 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 13:57:49 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Rebuilding the list of instances to heal {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 13:57:49 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Didn't find any instances for network info cache update. 
{{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 13:57:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:57:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:57:50 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:57:50 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:57:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:57:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:57:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:57:50 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 13:57:51 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:57:51 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 13:57:51 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=9505MB free_disk=26.395591735839844GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 13:57:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:57:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:57:51 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 13:57:51 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 13:57:51 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:57:51 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:57:51 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 13:57:51 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:57:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:57:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:57:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:57:52 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 13:58:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "5030decd-cbe5-4495-b497-dfacf25eef73" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "5030decd-cbe5-4495-b497-dfacf25eef73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:03 user nova-compute[71474]: DEBUG nova.compute.manager [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Starting instance... 
{{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 13:58:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 13:58:03 user nova-compute[71474]: INFO nova.compute.claims [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Claim successful on node user Apr 21 13:58:04 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.384s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Start building networks asynchronously for instance. 
{{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG nova.network.neutron [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 13:58:04 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 13:58:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Start spawning the instance on the hypervisor. 
{{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 13:58:04 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Creating image(s) Apr 21 13:58:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "/opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "/opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "/opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:04 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7.part --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:05 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7.part --force-share --output=json" returned: 0 in 0.129s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:05 user nova-compute[71474]: DEBUG nova.virt.images [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] 2edfef44-2867-4e03-a53e-b139f99afa75 was qcow2, converting to raw {{(pid=71474) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 21 13:58:05 user nova-compute[71474]: DEBUG nova.privsep.utils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71474) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 13:58:05 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7.part /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7.converted {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:05 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7.part /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7.converted" returned: 0 in 0.197s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:05 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7.converted --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:05 user nova-compute[71474]: DEBUG nova.policy [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '113884844de14ec7ac8a20ba06a389b3', 'user_domain_id': 'default', 'system_scope': 
None, 'domain_id': None, 'project_id': '91f5972380fd48eabffd46e6727239ce', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 13:58:05 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7.converted --force-share --output=json" returned: 0 in 0.122s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.792s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:05 user nova-compute[71474]: INFO oslo.privsep.daemon [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpq6hdso6i/privsep.sock'] Apr 21 13:58:05 user sudo[80430]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpq6hdso6i/privsep.sock Apr 21 13:58:05 user sudo[80430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 21 13:58:06 user sudo[80430]: pam_unix(sudo:session): session closed for user root Apr 21 13:58:07 user nova-compute[71474]: INFO oslo.privsep.daemon [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Spawned new privsep daemon via rootwrap Apr 21 13:58:07 user nova-compute[71474]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 21 13:58:07 user nova-compute[71474]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 21 13:58:07 user nova-compute[71474]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Apr 21 13:58:07 user nova-compute[71474]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80433 Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.137s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.003s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.137s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a 
tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk 1073741824" returned: 0 in 0.046s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.189s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.142s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Checking if we can resize image /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk. 
size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk --force-share --output=json" returned: 0 in 0.118s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Cannot resize image /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk to a smaller size. {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG nova.objects.instance [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lazy-loading 'migration_context' on Instance uuid 5030decd-cbe5-4495-b497-dfacf25eef73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Ensure instance console log exists: /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 
0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "30068c4a-94ed-4b84-9178-0d554326fc68" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "30068c4a-94ed-4b84-9178-0d554326fc68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 13:58:08 user nova-compute[71474]: INFO nova.compute.claims [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Claim successful on node user Apr 21 13:58:08 user nova-compute[71474]: DEBUG nova.network.neutron [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Successfully created port: ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Allocating IP information in the background. 
{{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG nova.network.neutron [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 13:58:08 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 13:58:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 13:58:08 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Creating image(s) Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "/opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "/opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock 
"/opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.137s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.118s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:08 user nova-compute[71474]: DEBUG 
oslo_concurrency.processutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk 1073741824" returned: 0 in 0.041s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.163s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.123s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Checking if we can resize image /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk. 
size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG nova.policy [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2259f365261c49b28b56ddd1c27c125d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a8c210480b33473c91156b798bcbd8b2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Cannot resize image /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk to a smaller size. 
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG nova.objects.instance [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lazy-loading 'migration_context' on Instance uuid 30068c4a-94ed-4b84-9178-0d554326fc68 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Ensure instance console log exists: /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG nova.network.neutron [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Successfully updated port: ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 
tempest-AttachVolumeTestJSON-1194238008-project-member] Acquired lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG nova.network.neutron [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG nova.network.neutron [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Instance cache missing network info. {{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG nova.compute.manager [req-ef629216-d3ae-4a2e-b261-647bb7637391 req-1ad568c2-4b76-488b-a793-512bb6e9abd2 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Received event network-changed-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG nova.compute.manager [req-ef629216-d3ae-4a2e-b261-647bb7637391 req-1ad568c2-4b76-488b-a793-512bb6e9abd2 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Refreshing instance network info cache due to event network-changed-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ef629216-d3ae-4a2e-b261-647bb7637391 req-1ad568c2-4b76-488b-a793-512bb6e9abd2 service nova] Acquiring lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG nova.network.neutron [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Successfully created port: 7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG nova.network.neutron [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Updating instance_info_cache with network_info: [{"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": 
{"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Releasing lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG nova.compute.manager [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Instance network_info: |[{"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ef629216-d3ae-4a2e-b261-647bb7637391 req-1ad568c2-4b76-488b-a793-512bb6e9abd2 service nova] Acquired lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:11 user nova-compute[71474]: DEBUG nova.network.neutron [req-ef629216-d3ae-4a2e-b261-647bb7637391 req-1ad568c2-4b76-488b-a793-512bb6e9abd2 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Refreshing network info cache for port ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Start _get_guest_xml network_info=[{"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 13:58:12 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:58:12 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 13:58:12 
user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.privsep.utils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71474) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-872679092',display_name='tempest-AttachVolumeTestJSON-server-872679092',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-872679092',id=1,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKmS+b9xlwFpk/ZB8qhPKYszGUSbkqS/wmxdPA2+EZTBrrlLdczt4kqaoNpF+PGGMbYhMksCR0Bk+16nj7FF3bWR1LjQ0yuktGjeeohbe83sc0eREByhabKclSuVNg5vfQ==',key_name='tempest-keypair-1234083121',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='91f5972380fd48eabffd46e6727239ce',ramdisk_id='',reservation_id='r-tfwpbn06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1194238008',owner_user_name='tempest-AttachVolumeTestJSON-1194238008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='113884844de14ec7ac8a20ba06a389b3',uuid=5030decd-cbe5-4495-b497-dfacf25eef73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converting VIF {"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:cc:bd,bridge_name='br-int',has_traffic_filtering=True,id=ed62554b-cbc2-4c0f-ad1a-821a0625a2e4,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped62554b-cb') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.objects.instance [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lazy-loading 'pci_devices' on Instance uuid 5030decd-cbe5-4495-b497-dfacf25eef73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] End _get_guest_xml xml= Apr 21 13:58:12 user nova-compute[71474]: 5030decd-cbe5-4495-b497-dfacf25eef73 Apr 21 13:58:12 user nova-compute[71474]: instance-00000001 Apr 21 13:58:12 user nova-compute[71474]: 131072 Apr 21 13:58:12 user nova-compute[71474]: 1 Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: tempest-AttachVolumeTestJSON-server-872679092 Apr 21 13:58:12 user nova-compute[71474]: 2023-04-21 13:58:12 Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: 128 Apr 21 13:58:12 user nova-compute[71474]: 1 Apr 21 13:58:12 user nova-compute[71474]: 0 Apr 21 13:58:12 user nova-compute[71474]: 0 Apr 21 13:58:12 user nova-compute[71474]: 1 Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: tempest-AttachVolumeTestJSON-1194238008-project-member Apr 21 13:58:12 user nova-compute[71474]: tempest-AttachVolumeTestJSON-1194238008 Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: OpenStack Foundation Apr 21 13:58:12 user nova-compute[71474]: OpenStack Nova Apr 21 13:58:12 user nova-compute[71474]: 0.0.0 Apr 21 13:58:12 user nova-compute[71474]: 
5030decd-cbe5-4495-b497-dfacf25eef73 Apr 21 13:58:12 user nova-compute[71474]: 5030decd-cbe5-4495-b497-dfacf25eef73 Apr 21 13:58:12 user nova-compute[71474]: Virtual Machine Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: hvm Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Nehalem Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: /dev/urandom Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: Apr 21 13:58:12 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-872679092',display_name='tempest-AttachVolumeTestJSON-server-872679092',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-872679092',id=1,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKmS+b9xlwFpk/ZB8qhPKYszGUSbkqS/wmxdPA2+EZTBrrlLdczt4kqaoNpF+PGGMbYhMksCR0Bk+16nj7FF3bWR1LjQ0yuktGjeeohbe83sc0eREByhabKclSuVNg5vfQ==',key_name='tempest-keypair-1234083121',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='91f5972380fd48eabffd46e6727239ce',ramdisk_id='',reservation_id='r-tfwpbn06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1194238008',owner_user_name='tempest-AttachVolumeTestJSON-1194238008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='113884844de14ec7ac8a20ba06a389b3',uuid=5030decd-cbe5-4495-b497-dfacf25eef73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converting VIF {"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:cc:bd,bridge_name='br-int',has_traffic_filtering=True,id=ed62554b-cbc2-4c0f-ad1a-821a0625a2e4,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped62554b-cb') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG os_vif [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:cc:bd,bridge_name='br-int',has_traffic_filtering=True,id=ed62554b-cbc2-4c0f-ad1a-821a0625a2e4,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped62554b-cb') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Created schema index Interface.name {{(pid=71474) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Created schema index Port.name {{(pid=71474) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Created schema index Bridge.name {{(pid=71474) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] tcp:127.0.0.1:6640: entering CONNECTING {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [POLLOUT] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:12 user nova-compute[71474]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 13:58:12 user nova-compute[71474]: INFO oslo.privsep.daemon [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpke8rlrjn/privsep.sock'] Apr 21 13:58:12 user sudo[80479]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpke8rlrjn/privsep.sock Apr 21 13:58:12 user sudo[80479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 21 13:58:13 user nova-compute[71474]: DEBUG nova.network.neutron [req-ef629216-d3ae-4a2e-b261-647bb7637391 req-1ad568c2-4b76-488b-a793-512bb6e9abd2 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Updated VIF entry in instance network info cache for port ed62554b-cbc2-4c0f-ad1a-821a0625a2e4. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 13:58:13 user nova-compute[71474]: DEBUG nova.network.neutron [req-ef629216-d3ae-4a2e-b261-647bb7637391 req-1ad568c2-4b76-488b-a793-512bb6e9abd2 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Updating instance_info_cache with network_info: [{"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ef629216-d3ae-4a2e-b261-647bb7637391 req-1ad568c2-4b76-488b-a793-512bb6e9abd2 service nova] Releasing lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:13 user sudo[80479]: pam_unix(sudo:session): session closed for user root Apr 21 13:58:13 user nova-compute[71474]: INFO oslo.privsep.daemon [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Spawned new privsep daemon via rootwrap Apr 21 13:58:13 user nova-compute[71474]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 21 13:58:13 user nova-compute[71474]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 21 13:58:13 user nova-compute[71474]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Apr 21 13:58:13 user nova-compute[71474]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80482 Apr 21 13:58:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped62554b-cb, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=taped62554b-cb, col_values=(('external_ids', {'iface-id': 
'ed62554b-cbc2-4c0f-ad1a-821a0625a2e4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:cc:bd', 'vm-uuid': '5030decd-cbe5-4495-b497-dfacf25eef73'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:14 user nova-compute[71474]: INFO os_vif [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:cc:bd,bridge_name='br-int',has_traffic_filtering=True,id=ed62554b-cbc2-4c0f-ad1a-821a0625a2e4,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped62554b-cb') Apr 21 13:58:14 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] No VIF found with MAC fa:16:3e:e2:cc:bd, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "3af27bc9-9617-44c7-bfa4-993b347d183c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Starting instance... 
{{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG nova.network.neutron [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Successfully updated port: 7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquired lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG nova.network.neutron [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 13:58:14 user nova-compute[71474]: INFO nova.compute.claims [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Claim successful on node user Apr 21 13:58:14 user nova-compute[71474]: DEBUG nova.network.neutron [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 13:58:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.network.neutron [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Updating instance_info_cache with network_info: [{"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.compute.manager [req-94919cc0-3609-4cec-a75a-da318d464e87 req-50e91cff-0571-459c-8434-ca5943581982 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Received event network-changed-7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.compute.manager [req-94919cc0-3609-4cec-a75a-da318d464e87 req-50e91cff-0571-459c-8434-ca5943581982 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Refreshing instance network info cache due to event network-changed-7361228d-9a8e-4921-9cb8-fc59a0a45063. 
{{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-94919cc0-3609-4cec-a75a-da318d464e87 req-50e91cff-0571-459c-8434-ca5943581982 service nova] Acquiring lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Releasing lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.compute.manager [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Instance network_info: |[{"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-94919cc0-3609-4cec-a75a-da318d464e87 req-50e91cff-0571-459c-8434-ca5943581982 service nova] Acquired lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.network.neutron [req-94919cc0-3609-4cec-a75a-da318d464e87 req-50e91cff-0571-459c-8434-ca5943581982 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Refreshing network info cache for port 7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Start _get_guest_xml network_info=[{"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:58:15 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:58:15 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1791477557',display_name='tempest-ServersNegativeTestJSON-server-1791477557',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1791477557',id=2,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8c210480b33473c91156b798bcbd8b2',ramdisk_id='',reservation_id='r-0bp8i0q4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1552178734',owner_user_name='tempest-ServersNegativeTestJSON-1552178734-project-member'},tags=TagLis
t,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:09Z,user_data=None,user_id='2259f365261c49b28b56ddd1c27c125d',uuid=30068c4a-94ed-4b84-9178-0d554326fc68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converting VIF {"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:01:3d,bridge_name='br-int',has_traffic_filtering=True,id=7361228d-9a8e-4921-9cb8-fc59a0a45063,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7361228d-9a') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.objects.instance [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lazy-loading 'pci_devices' on Instance uuid 
30068c4a-94ed-4b84-9178-0d554326fc68 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.692s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] End _get_guest_xml xml=
[libvirt guest domain XML for instance 30068c4a-94ed-4b84-9178-0d554326fc68 was logged here across multiple syslog lines; the XML markup was stripped during capture, leaving only element values such as instance-00000002, memory 131072, 1 vCPU, tempest-ServersNegativeTestJSON-server-1791477557, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, machine type hvm, CPU model Nehalem, and RNG backend /dev/urandom]
Apr 21 13:58:15 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1791477557',display_name='tempest-ServersNegativeTestJSON-server-1791477557',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1791477557',id=2,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8c210480b33473c91156b798bcbd8b2',ramdisk_id='',reservation_id='r-0bp8i0q4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1552178734',owner_user_name='tempest-ServersNegativeTestJSON-1552178734-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:09Z,user_data=None,user_id='2259f365261c49b28b56ddd1c27c125d',uuid=30068c4a-94ed-4b84-9178-0d554326fc68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converting VIF {"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": 
"d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:01:3d,bridge_name='br-int',has_traffic_filtering=True,id=7361228d-9a8e-4921-9cb8-fc59a0a45063,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7361228d-9a') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG os_vif [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:01:3d,bridge_name='br-int',has_traffic_filtering=True,id=7361228d-9a8e-4921-9cb8-fc59a0a45063,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7361228d-9a') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7361228d-9a, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): 
DbSetCommand(_result=None, table=Interface, record=tap7361228d-9a, col_values=(('external_ids', {'iface-id': '7361228d-9a8e-4921-9cb8-fc59a0a45063', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3c:01:3d', 'vm-uuid': '30068c4a-94ed-4b84-9178-0d554326fc68'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: INFO os_vif [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:01:3d,bridge_name='br-int',has_traffic_filtering=True,id=7361228d-9a8e-4921-9cb8-fc59a0a45063,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7361228d-9a') Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.network.neutron [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] No VIF found with MAC fa:16:3e:3c:01:3d, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 13:58:15 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 13:58:15 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Creating image(s) Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "/opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "/opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "/opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Running cmd (subprocess): 
/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.143s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:15 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG nova.policy [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] 
Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '92c19bad528a4c38860a43913b28b85b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a8fedc10f324a92aef4142ab7efdd6a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.155s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk 1073741824" returned: 0 in 0.056s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.215s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 
tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Checking if we can resize image /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk --force-share --output=json" returned: 0 in 0.120s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Cannot resize image /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk to a smaller size. 
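The qemu-img info probes above, together with the "Cannot resize image ... to a smaller size" outcome, amount to a grow-only size check against the image's current virtual size. A rough, self-contained sketch of that check; assumed behaviour, not nova's implementation:

    # Sketch of a grow-only resize check, shelling out to qemu-img as the log shows.
    import json
    import subprocess

    def virtual_size(path):
        out = subprocess.check_output(
            ['qemu-img', 'info', '--force-share', '--output=json', path])
        return json.loads(out)['virtual-size']  # bytes

    def can_resize(path, requested_bytes):
        # Growing is allowed; requesting less than the current virtual size is
        # refused, which is what the log line above reports.
        return requested_bytes >= virtual_size(path)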
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG nova.objects.instance [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lazy-loading 'migration_context' on Instance uuid 3af27bc9-9617-44c7-bfa4-993b347d183c {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Ensure instance console log exists: /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG nova.network.neutron [req-94919cc0-3609-4cec-a75a-da318d464e87 req-50e91cff-0571-459c-8434-ca5943581982 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Updated VIF entry in instance network info cache for port 7361228d-9a8e-4921-9cb8-fc59a0a45063. 
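The vgpu_resources entries above ("Acquiring lock ... by ... _allocate_mdevs", "acquired ... waited", "released ... held") are the usual oslo.concurrency trace. A minimal sketch of the pattern using the library's public decorator; the function below is illustrative, not nova's code:

    # Illustrative sketch of the lock pattern behind the vgpu_resources entries above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('vgpu_resources')
    def allocate_mdevs_example(enabled_mdev_types):
        # Runs with the named semaphore held; oslo.concurrency emits the
        # "Acquiring lock ...", "acquired ... waited" and "released ... held"
        # DEBUG lines around this body.
        return []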
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG nova.network.neutron [req-94919cc0-3609-4cec-a75a-da318d464e87 req-50e91cff-0571-459c-8434-ca5943581982 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Updating instance_info_cache with network_info: [{"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:16 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-94919cc0-3609-4cec-a75a-da318d464e87 req-50e91cff-0571-459c-8434-ca5943581982 service nova] Releasing lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG nova.compute.manager [req-94b909ce-2c44-4a11-83aa-ef6f2ae1fb09 req-b1174765-d494-473c-9dc4-6c28e07f0b52 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Received event network-vif-plugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-94b909ce-2c44-4a11-83aa-ef6f2ae1fb09 req-b1174765-d494-473c-9dc4-6c28e07f0b52 service nova] Acquiring lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-94b909ce-2c44-4a11-83aa-ef6f2ae1fb09 req-b1174765-d494-473c-9dc4-6c28e07f0b52 service nova] Lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-94b909ce-2c44-4a11-83aa-ef6f2ae1fb09 req-b1174765-d494-473c-9dc4-6c28e07f0b52 service nova] Lock 
"5030decd-cbe5-4495-b497-dfacf25eef73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG nova.compute.manager [req-94b909ce-2c44-4a11-83aa-ef6f2ae1fb09 req-b1174765-d494-473c-9dc4-6c28e07f0b52 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] No waiting events found dispatching network-vif-plugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:17 user nova-compute[71474]: WARNING nova.compute.manager [req-94b909ce-2c44-4a11-83aa-ef6f2ae1fb09 req-b1174765-d494-473c-9dc4-6c28e07f0b52 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Received unexpected event network-vif-plugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 for instance with vm_state building and task_state spawning. Apr 21 13:58:17 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Acquiring lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:17 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 13:58:17 user nova-compute[71474]: INFO nova.compute.claims [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Claim successful on node user Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 13:58:18 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.network.neutron [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Successfully created port: 7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:18 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] VM Resumed (Lifecycle Event) Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 13:58:18 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Creating image(s) Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Acquiring lock "/opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "/opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b 
tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "/opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.003s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.compute.manager [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:18 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Instance spawned successfully. 
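The `qemu-img info` calls above are wrapped in `oslo_concurrency.prlimit` so that inspecting an image cannot exhaust the host (address space capped at 1073741824 bytes, CPU time at 30 s). Below is a minimal sketch of issuing such a limited call through oslo.concurrency's public `processutils` API; the helper name and module layout are illustrative, not Nova's exact code, and the path shown is the base-image path from the log.

```python
# Sketch: run `qemu-img info` under resource limits, mirroring the
# prlimit-wrapped commands in the log above. Assumes oslo.concurrency's
# public processutils API; the helper name is ours.
from oslo_concurrency import processutils

QEMU_IMG_LIMITS = processutils.ProcessLimits(
    cpu_time=30,                  # --cpu=30 in the logged command
    address_space=1 * 1024 ** 3,  # --as=1073741824 in the logged command
)

def qemu_img_info(path):
    # Returns the JSON text printed by qemu-img; the caller parses it.
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', path, '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS)
    return out

# Example, using the cached base image path seen in the log:
# qemu_img_info('/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7')
```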
Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:18 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 
5030decd-cbe5-4495-b497-dfacf25eef73] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:18 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] VM Started (Lifecycle Event) Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.143s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.policy [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d7d1e7446af4edf8e35a9d0178b2895', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b4c4270d6dfa435f94da018d12586bcd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 
5030decd-cbe5-4495-b497-dfacf25eef73] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:18 user nova-compute[71474]: INFO nova.compute.manager [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Took 14.06 seconds to spawn the instance on the hypervisor. Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.compute.manager [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:18 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.165s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:18 user nova-compute[71474]: INFO nova.compute.manager [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Took 15.00 seconds to build instance. 
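The `qemu-img create -f qcow2 -o backing_file=...` command above builds the per-instance disk for 2c5afe45 as a thin copy-on-write overlay on the shared raw base image in `_base/`. The sketch below reproduces that layering with `subprocess`; the helper name is ours, and the arguments mirror the logged command.

```python
# Sketch of the copy-on-write layering performed by the logged qemu-img call:
# the instance disk is a qcow2 overlay whose backing file is the shared base image.
import os
import subprocess

def create_overlay(base_image, instance_disk, size_bytes):
    subprocess.run(
        ['qemu-img', 'create', '-f', 'qcow2',
         '-o', f'backing_file={base_image},backing_fmt=raw',
         instance_disk, str(size_bytes)],
        check=True,
        env={**os.environ, 'LC_ALL': 'C', 'LANG': 'C'})

# As in the log: a 1073741824-byte (1 GiB) root disk for the instance.
# create_overlay('/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7',
#                '/opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk',
#                1073741824)
```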
Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk 1073741824" returned: 0 in 0.070s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.238s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-ccc096dc-6294-43c3-8d29-79d5ea855c6a tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "5030decd-cbe5-4495-b497-dfacf25eef73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.252s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.147s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Checking if we can resize image /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk. 
size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 13:58:18 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json" returned: 0 in 0.164s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Cannot resize image /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk to a smaller size. {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.objects.instance [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lazy-loading 'migration_context' on Instance uuid 2c5afe45-87ae-477a-8bf0-6a5e2036fb68 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Ensure instance console log exists: /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 
tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:19 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] VM Resumed (Lifecycle Event) Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 13:58:19 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Instance spawned successfully. 
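The repeated "Acquiring lock ... acquired ... released" lines (for `vgpu_resources`, `compute_resources`, the per-instance `disk.info` file, and so on) are emitted by oslo.concurrency around critical sections. Below is a minimal sketch of that pattern using lockutils' public API; the decorated function is illustrative, not Nova's actual `_allocate_mdevs`.

```python
# Sketch of the lock pattern behind the "Acquiring/acquired/released" log lines,
# using oslo.concurrency. Lock names mirror the log; the function body is ours.
from oslo_concurrency import lockutils

@lockutils.synchronized('vgpu_resources')
def allocate_mdevs(allocations):
    # Runs with the named in-process lock held; lockutils logs the wait and
    # hold times around calls like this, as seen above.
    return []

# A context-manager form is also available for ad-hoc critical sections:
# with lockutils.lock('compute_resources'):
#     ...claim or free resources...
```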
Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 
30068c4a-94ed-4b84-9178-0d554326fc68] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:19 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:19 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] VM Started (Lifecycle Event) Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:19 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:58:19 user nova-compute[71474]: INFO nova.compute.manager [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Took 10.69 seconds to spawn the instance on the hypervisor. Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:19 user nova-compute[71474]: INFO nova.compute.manager [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Took 11.25 seconds to build instance. 
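The "During sync_power_state the instance has a pending task (spawning). Skip." messages show the lifecycle-event handler deferring power-state reconciliation while another operation still owns the instance. The following is a deliberately simplified toy illustration of that decision, not Nova's actual implementation.

```python
# Toy illustration (not Nova's code) of the skip decision logged above:
# a lifecycle-triggered power-state sync is deferred while a task is in flight.
def should_sync_power_state(task_state, db_power_state, vm_power_state):
    if task_state is not None:        # e.g. 'spawning' in the log above
        return False                  # skip: another operation owns the instance
    return db_power_state != vm_power_state

assert should_sync_power_state('spawning', 0, 1) is False
assert should_sync_power_state(None, 0, 1) is True
```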
Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-62f86bb9-4153-428a-a29e-db14c94d2acb tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "30068c4a-94ed-4b84-9178-0d554326fc68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.380s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Received event network-vif-plugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] Acquiring lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] Lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] Lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] No waiting events found dispatching network-vif-plugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:19 user nova-compute[71474]: WARNING nova.compute.manager [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Received unexpected event network-vif-plugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 for instance with vm_state active and task_state None. 
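The "No waiting events found dispatching network-vif-plugged-..." warnings come from the external-event dispatcher: a build thread may register a waiter for a port's plug event, and the handler pops that waiter when Neutron reports the event; with no registered waiter the event is logged as unexpected and dropped. Below is a toy sketch of that registry pattern (not Nova's code), reusing the instance and port IDs from the log.

```python
# Toy sketch (not Nova's implementation) of the waiter registry behind
# "No waiting events found dispatching network-vif-plugged-...".
import threading
from collections import defaultdict

class InstanceEvents:
    def __init__(self):
        self._waiters = defaultdict(dict)   # instance_uuid -> {event_name: Event}
        self._lock = threading.Lock()

    def prepare(self, instance_uuid, event_name):
        # A build thread calls this before plugging the VIF, then waits on the Event.
        ev = threading.Event()
        with self._lock:
            self._waiters[instance_uuid][event_name] = ev
        return ev

    def pop(self, instance_uuid, event_name):
        # The external-event handler calls this when Neutron reports the event.
        with self._lock:
            return self._waiters[instance_uuid].pop(event_name, None)

events = InstanceEvents()
waiter = events.pop('30068c4a-94ed-4b84-9178-0d554326fc68',
                    'network-vif-plugged-7361228d-9a8e-4921-9cb8-fc59a0a45063')
if waiter is None:
    print('Received unexpected event; no one is waiting for it.')
else:
    waiter.set()
```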
Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Received event network-vif-plugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] Acquiring lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] Lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] Lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] No waiting events found dispatching network-vif-plugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:19 user nova-compute[71474]: WARNING nova.compute.manager [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Received unexpected event network-vif-plugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 for instance with vm_state active and task_state None. 
Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Received event network-vif-plugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] Acquiring lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] Lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] Lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:19 user nova-compute[71474]: DEBUG nova.compute.manager [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] No waiting events found dispatching network-vif-plugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:19 user nova-compute[71474]: WARNING nova.compute.manager [req-15037ced-8d60-4afa-84e0-ace82e2db0dc req-c5d054f7-a78c-45d5-9768-cc39a57f9fb5 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Received unexpected event network-vif-plugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 for instance with vm_state active and task_state None. 
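Events such as network-vif-plugged and network-changed reach nova-compute because Neutron posts them to Nova's os-server-external-events API. The sketch below shows the request shape; the body fields follow the documented API, while the endpoint URL and token are placeholders, and a real deployment authenticates through a keystoneauth session rather than raw requests.

```python
# Sketch of how Neutron-originated events like network-vif-plugged are delivered
# to Nova. Endpoint and token are placeholders; body fields follow the
# os-server-external-events API.
import requests

NOVA_API = 'http://controller:8774/v2.1'   # placeholder endpoint
TOKEN = 'gAAAA...'                          # placeholder service token

def notify_vif_plugged(server_uuid, port_id):
    body = {'events': [{
        'name': 'network-vif-plugged',
        'server_uuid': server_uuid,
        'tag': port_id,              # the port UUID seen in the log lines above
        'status': 'completed',
    }]}
    resp = requests.post(f'{NOVA_API}/os-server-external-events',
                         json=body, headers={'X-Auth-Token': TOKEN})
    resp.raise_for_status()
    return resp.json()
```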
Apr 21 13:58:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.network.neutron [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Successfully updated port: 7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "refresh_cache-3af27bc9-9617-44c7-bfa4-993b347d183c" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquired lock "refresh_cache-3af27bc9-9617-44c7-bfa4-993b347d183c" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.network.neutron [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.network.neutron [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Instance cache missing network info. {{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.compute.manager [req-a1df6cef-89fa-4e75-b8e2-6a1cb2d948d4 req-5afb44c8-9c41-49f3-87a8-11844da88d2b service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Received event network-changed-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.compute.manager [req-a1df6cef-89fa-4e75-b8e2-6a1cb2d948d4 req-5afb44c8-9c41-49f3-87a8-11844da88d2b service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Refreshing instance network info cache due to event network-changed-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d. 
{{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a1df6cef-89fa-4e75-b8e2-6a1cb2d948d4 req-5afb44c8-9c41-49f3-87a8-11844da88d2b service nova] Acquiring lock "refresh_cache-3af27bc9-9617-44c7-bfa4-993b347d183c" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.network.neutron [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Updating instance_info_cache with network_info: [{"id": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "address": "fa:16:3e:da:eb:fa", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e465dfa-8a", "ovs_interfaceid": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Releasing lock "refresh_cache-3af27bc9-9617-44c7-bfa4-993b347d183c" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Instance network_info: |[{"id": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "address": "fa:16:3e:da:eb:fa", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e465dfa-8a", "ovs_interfaceid": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a1df6cef-89fa-4e75-b8e2-6a1cb2d948d4 req-5afb44c8-9c41-49f3-87a8-11844da88d2b service nova] Acquired lock "refresh_cache-3af27bc9-9617-44c7-bfa4-993b347d183c" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.network.neutron [req-a1df6cef-89fa-4e75-b8e2-6a1cb2d948d4 req-5afb44c8-9c41-49f3-87a8-11844da88d2b service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Refreshing network info cache for port 7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Start _get_guest_xml network_info=[{"id": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "address": "fa:16:3e:da:eb:fa", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e465dfa-8a", "ovs_interfaceid": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml 
/opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 13:58:20 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:58:20 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 13:58:20 user 
nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1376655679',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1376655679',id=3,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBSfCm4kaW8kKhEYk26quHUV0I+U7qCvNO4t2+x3r4VBiTtZqyaoR2EGTBvf5XEnA51qy75jGfzROX158wPVnPuLv0NMr8g0Jge5rbAgJLAI/7LXIqSONczhV0yG5arkdQ==',key_name='tempest-keypair-2134167482',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a8fedc10f324a92aef4142ab7efdd6a',ramdisk_id='',reservation_id='r-n1llw0zy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-2115713901',owner_user_name='tempest-AttachVolumeShelveTestJSON-2115713901-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='92c19bad528a4c38860a43913b28b85b',uuid=3af27bc9-9617-44c7-bfa4-993b347d183c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "address": "fa:16:3e:da:eb:fa", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e465dfa-8a", "ovs_interfaceid": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converting VIF {"id": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "address": "fa:16:3e:da:eb:fa", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e465dfa-8a", "ovs_interfaceid": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:eb:fa,bridge_name='br-int',has_traffic_filtering=True,id=7e465dfa-8ae6-4806-b18d-e23dcbf0a97d,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e465dfa-8a') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.objects.instance [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lazy-loading 'pci_devices' on Instance uuid 3af27bc9-9617-44c7-bfa4-993b347d183c {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] End _get_guest_xml xml= Apr 21 13:58:20 user nova-compute[71474]: 3af27bc9-9617-44c7-bfa4-993b347d183c Apr 21 13:58:20 user nova-compute[71474]: instance-00000003 Apr 21 13:58:20 user nova-compute[71474]: 131072 Apr 21 13:58:20 user nova-compute[71474]: 1 Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: tempest-AttachVolumeShelveTestJSON-server-1376655679 Apr 21 13:58:20 user nova-compute[71474]: 2023-04-21 13:58:20 Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: 128 Apr 21 13:58:20 user nova-compute[71474]: 1 Apr 21 13:58:20 user nova-compute[71474]: 0 Apr 21 13:58:20 user nova-compute[71474]: 0 Apr 21 13:58:20 user nova-compute[71474]: 1 Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: tempest-AttachVolumeShelveTestJSON-2115713901-project-member Apr 21 13:58:20 user nova-compute[71474]: tempest-AttachVolumeShelveTestJSON-2115713901 Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: OpenStack Foundation Apr 21 13:58:20 user nova-compute[71474]: OpenStack Nova Apr 21 13:58:20 user 
nova-compute[71474]: 0.0.0 Apr 21 13:58:20 user nova-compute[71474]: 3af27bc9-9617-44c7-bfa4-993b347d183c Apr 21 13:58:20 user nova-compute[71474]: 3af27bc9-9617-44c7-bfa4-993b347d183c Apr 21 13:58:20 user nova-compute[71474]: Virtual Machine Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: hvm Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Nehalem Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: /dev/urandom Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: Apr 21 13:58:20 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1376655679',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1376655679',id=3,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBSfCm4kaW8kKhEYk26quHUV0I+U7qCvNO4t2+x3r4VBiTtZqyaoR2EGTBvf5XEnA51qy75jGfzROX158wPVnPuLv0NMr8g0Jge5rbAgJLAI/7LXIqSONczhV0yG5arkdQ==',key_name='tempest-keypair-2134167482',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a8fedc10f324a92aef4142ab7efdd6a',ramdisk_id='',reservation_id='r-n1llw0zy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-2115713901',owner_user_name='tempest-AttachVolumeShelveTestJSON-2115713901-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='92c19bad528a4c38860a43913b28b85b',uuid=3af27bc9-9617-44c7-bfa4-993b347d183c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "address": "fa:16:3e:da:eb:fa", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e465dfa-8a", "ovs_interfaceid": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converting VIF {"id": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "address": "fa:16:3e:da:eb:fa", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e465dfa-8a", "ovs_interfaceid": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:eb:fa,bridge_name='br-int',has_traffic_filtering=True,id=7e465dfa-8ae6-4806-b18d-e23dcbf0a97d,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e465dfa-8a') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG os_vif [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:eb:fa,bridge_name='br-int',has_traffic_filtering=True,id=7e465dfa-8ae6-4806-b18d-e23dcbf0a97d,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e465dfa-8a') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e465dfa-8a, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7e465dfa-8a, col_values=(('external_ids', {'iface-id': '7e465dfa-8ae6-4806-b18d-e23dcbf0a97d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:eb:fa', 'vm-uuid': '3af27bc9-9617-44c7-bfa4-993b347d183c'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:20 user 
nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 13:58:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:20 user nova-compute[71474]: INFO os_vif [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:eb:fa,bridge_name='br-int',has_traffic_filtering=True,id=7e465dfa-8ae6-4806-b18d-e23dcbf0a97d,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e465dfa-8a') Apr 21 13:58:21 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 13:58:21 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] No VIF found with MAC fa:16:3e:da:eb:fa, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 13:58:21 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Successfully created port: 2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 13:58:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:22 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:22 user nova-compute[71474]: DEBUG nova.network.neutron [req-a1df6cef-89fa-4e75-b8e2-6a1cb2d948d4 req-5afb44c8-9c41-49f3-87a8-11844da88d2b service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Updated VIF entry in instance network info cache for port 7e465dfa-8ae6-4806-b18d-e23dcbf0a97d. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 13:58:22 user nova-compute[71474]: DEBUG nova.network.neutron [req-a1df6cef-89fa-4e75-b8e2-6a1cb2d948d4 req-5afb44c8-9c41-49f3-87a8-11844da88d2b service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Updating instance_info_cache with network_info: [{"id": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "address": "fa:16:3e:da:eb:fa", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e465dfa-8a", "ovs_interfaceid": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a1df6cef-89fa-4e75-b8e2-6a1cb2d948d4 req-5afb44c8-9c41-49f3-87a8-11844da88d2b service nova] Releasing lock "refresh_cache-3af27bc9-9617-44c7-bfa4-993b347d183c" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:22 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:22 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:22 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:22 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Successfully updated port: 2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Acquiring lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 
tempest-ServerStableDeviceRescueTest-1083322898-project-member] Acquired lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.compute.manager [req-e59d1d6f-d37c-435f-acea-83d6a28200ed req-a1b114e2-4085-4f9e-8a34-1087e64ceac6 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Received event network-changed-2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.compute.manager [req-e59d1d6f-d37c-435f-acea-83d6a28200ed req-a1b114e2-4085-4f9e-8a34-1087e64ceac6 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Refreshing instance network info cache due to event network-changed-2616f5a4-1b53-44bd-82ad-65419e2839ca. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e59d1d6f-d37c-435f-acea-83d6a28200ed req-a1b114e2-4085-4f9e-8a34-1087e64ceac6 service nova] Acquiring lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Updating instance_info_cache with network_info: [{"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Releasing lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Instance network_info: |[{"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e59d1d6f-d37c-435f-acea-83d6a28200ed req-a1b114e2-4085-4f9e-8a34-1087e64ceac6 service nova] Acquired lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.network.neutron [req-e59d1d6f-d37c-435f-acea-83d6a28200ed req-a1b114e2-4085-4f9e-8a34-1087e64ceac6 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Refreshing network info cache for port 2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Start _get_guest_xml network_info=[{"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 
'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 13:58:23 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:58:23 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Image pref 0:0:0 
{{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1149674532',display_name='tempest-ServerStableDeviceRescueTest-server-1149674532',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-1149674532',id=4,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMkCCUCEByuZA8uefKTCCf6BwNOi3GFQvMJ7eA+GdJBuYKCUigOvF7jv5smuTcvHYLmZKP4LkvWhlc4WMHNO3mTFd+RXuNxX7VqhNcJysaZOOp2XhD7KgmsEEHk9+iiuvQ==',key_name='tempest-keypair-1535115748',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4c4270d6dfa435f94da018d12586bcd',ramdisk_id='',reservation_id='r-w1rsfwl0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1083322898',owner_user_name='tempest-ServerStableDeviceRescueTest-1083322898-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0d7d1e7446af4edf8e35a9d0178b2895',uuid=2c5afe45-87ae-477a-8bf0-6a5e2036fb68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Converting VIF {"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:41:c0,bridge_name='br-int',has_traffic_filtering=True,id=2616f5a4-1b53-44bd-82ad-65419e2839ca,network=Network(43525cbd-9d02-4cd7-b457-b26a485106f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2616f5a4-1b') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.objects.instance [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lazy-loading 'pci_devices' on Instance uuid 2c5afe45-87ae-477a-8bf0-6a5e2036fb68 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] End _get_guest_xml xml= Apr 21 13:58:23 user nova-compute[71474]: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68 Apr 21 13:58:23 user nova-compute[71474]: instance-00000004 Apr 21 13:58:23 user nova-compute[71474]: 131072 Apr 21 13:58:23 user nova-compute[71474]: 1 Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: tempest-ServerStableDeviceRescueTest-server-1149674532 Apr 21 13:58:23 user nova-compute[71474]: 2023-04-21 13:58:23 Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: 128 Apr 21 13:58:23 user nova-compute[71474]: 1 Apr 21 13:58:23 user nova-compute[71474]: 0 Apr 21 13:58:23 user nova-compute[71474]: 0 Apr 21 13:58:23 user nova-compute[71474]: 1 Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: tempest-ServerStableDeviceRescueTest-1083322898-project-member Apr 21 13:58:23 user nova-compute[71474]: tempest-ServerStableDeviceRescueTest-1083322898 Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: OpenStack Foundation Apr 21 13:58:23 user nova-compute[71474]: OpenStack Nova Apr 21 
13:58:23 user nova-compute[71474]: 0.0.0 Apr 21 13:58:23 user nova-compute[71474]: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68 Apr 21 13:58:23 user nova-compute[71474]: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68 Apr 21 13:58:23 user nova-compute[71474]: Virtual Machine Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: hvm Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Nehalem Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: /dev/urandom Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: Apr 21 13:58:23 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1149674532',display_name='tempest-ServerStableDeviceRescueTest-server-1149674532',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-1149674532',id=4,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMkCCUCEByuZA8uefKTCCf6BwNOi3GFQvMJ7eA+GdJBuYKCUigOvF7jv5smuTcvHYLmZKP4LkvWhlc4WMHNO3mTFd+RXuNxX7VqhNcJysaZOOp2XhD7KgmsEEHk9+iiuvQ==',key_name='tempest-keypair-1535115748',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4c4270d6dfa435f94da018d12586bcd',ramdisk_id='',reservation_id='r-w1rsfwl0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1083322898',owner_user_name='tempest-ServerStableDeviceRescueTest-1083322898-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0d7d1e7446af4edf8e35a9d0178b2895',uuid=2c5afe45-87ae-477a-8bf0-6a5e2036fb68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Converting VIF {"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:41:c0,bridge_name='br-int',has_traffic_filtering=True,id=2616f5a4-1b53-44bd-82ad-65419e2839ca,network=Network(43525cbd-9d02-4cd7-b457-b26a485106f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2616f5a4-1b') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG os_vif [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:41:c0,bridge_name='br-int',has_traffic_filtering=True,id=2616f5a4-1b53-44bd-82ad-65419e2839ca,network=Network(43525cbd-9d02-4cd7-b457-b26a485106f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2616f5a4-1b') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2616f5a4-1b, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2616f5a4-1b, col_values=(('external_ids', {'iface-id': '2616f5a4-1b53-44bd-82ad-65419e2839ca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:41:c0', 'vm-uuid': '2c5afe45-87ae-477a-8bf0-6a5e2036fb68'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 
13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:23 user nova-compute[71474]: INFO os_vif [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:41:c0,bridge_name='br-int',has_traffic_filtering=True,id=2616f5a4-1b53-44bd-82ad-65419e2839ca,network=Network(43525cbd-9d02-4cd7-b457-b26a485106f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2616f5a4-1b') Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 13:58:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] No VIF found with MAC fa:16:3e:52:41:c0, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.compute.manager [req-20b9aa2d-fec6-42a1-a528-5827fd76c3bd req-e8412dd6-3366-4f1f-9814-3837067bfb61 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Received event network-vif-plugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-20b9aa2d-fec6-42a1-a528-5827fd76c3bd req-e8412dd6-3366-4f1f-9814-3837067bfb61 service nova] Acquiring lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-20b9aa2d-fec6-42a1-a528-5827fd76c3bd req-e8412dd6-3366-4f1f-9814-3837067bfb61 service nova] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.015s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-20b9aa2d-fec6-42a1-a528-5827fd76c3bd req-e8412dd6-3366-4f1f-9814-3837067bfb61 service nova] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.006s 
{{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.compute.manager [req-20b9aa2d-fec6-42a1-a528-5827fd76c3bd req-e8412dd6-3366-4f1f-9814-3837067bfb61 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] No waiting events found dispatching network-vif-plugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:24 user nova-compute[71474]: WARNING nova.compute.manager [req-20b9aa2d-fec6-42a1-a528-5827fd76c3bd req-e8412dd6-3366-4f1f-9814-3837067bfb61 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Received unexpected event network-vif-plugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d for instance with vm_state building and task_state spawning. Apr 21 13:58:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.network.neutron [req-e59d1d6f-d37c-435f-acea-83d6a28200ed req-a1b114e2-4085-4f9e-8a34-1087e64ceac6 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Updated VIF entry in instance network info cache for port 2616f5a4-1b53-44bd-82ad-65419e2839ca. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.network.neutron [req-e59d1d6f-d37c-435f-acea-83d6a28200ed req-a1b114e2-4085-4f9e-8a34-1087e64ceac6 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Updating instance_info_cache with network_info: [{"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e59d1d6f-d37c-435f-acea-83d6a28200ed req-a1b114e2-4085-4f9e-8a34-1087e64ceac6 service nova] Releasing lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:24 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] VM Resumed (Lifecycle Event) Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 13:58:24 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Instance spawned successfully. 
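The entries from 13:58:20 onwards trace the complete VIF plug path for instance 3af27bc9: nova_to_osvif_vif converts the Neutron port dictionary into a VIFOpenVSwitch object, os_vif.plug() hands it to the 'ovs' plugin, and the plugin issues the OVSDB transaction shown by the ovsdbapp entries (an idempotent AddBridgeCommand for br-int, an AddPortCommand for tap7e465dfa-8a, and a DbSetCommand writing the iface-id/attached-mac/vm-uuid external_ids) before "Successfully plugged vif" is logged. A minimal standalone sketch of that flow follows; the identifiers are copied from the log above, while the exact field set and plugin behaviour are assumptions about this os-vif version rather than Nova's verbatim call sequence.

    import os_vif
    from os_vif.objects import instance_info, network, vif

    os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins, as logged at startup

    # Values copied from the log entries above; field names are assumptions.
    net = network.Network(id='1815b48e-38a4-4a83-a23b-d7c2ce38a2c3',
                          bridge='br-int', mtu=1442)
    port = vif.VIFOpenVSwitch(
        id='7e465dfa-8ae6-4806-b18d-e23dcbf0a97d',
        address='fa:16:3e:da:eb:fa',
        vif_name='tap7e465dfa-8a',
        bridge_name='br-int',
        plugin='ovs',
        network=net,
        port_profile=vif.VIFPortProfileOpenVSwitch(
            interface_id='7e465dfa-8ae6-4806-b18d-e23dcbf0a97d'))
    inst = instance_info.InstanceInfo(
        uuid='3af27bc9-9617-44c7-bfa4-993b347d183c',
        name='instance-00000003')

    # Dispatches to the 'ovs' plugin which, against the local OVSDB, performs
    # the same work the ovsdbapp entries show: an idempotent add-br br-int,
    # an add-port for the tap device, and a db_set of the Interface
    # external_ids (iface-id, iface-status, attached-mac, vm-uuid).
    os_vif.plug(port, inst)

Teardown would be expected to go through os_vif.unplug(port, inst) with the same objects, removing the tap port from br-int again.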
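The nova.virt.hardware entries at 13:58:23 (for instance 2c5afe45) show the CPU topology search for the m1.nano flavor: neither flavor nor image expresses a preference, the limits default to 65536 sockets/cores/threads, and the only topology that multiplies out to 1 vCPU is sockets=1, cores=1, threads=1. A toy enumeration with the same outcome (an illustration of the idea, not Nova's exact _get_possible_cpu_topologies code):

    # Candidate topologies are factorisations of the vCPU count within the limits.
    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        yield (sockets, cores, threads)

    print(list(possible_topologies(1)))  # [(1, 1, 1)] -- matches the log above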
Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 
tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:24 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:58:24 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:24 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] VM Started (Lifecycle Event) Apr 21 13:58:25 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:25 user nova-compute[71474]: INFO nova.compute.manager [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Took 9.28 seconds to spawn the instance on the hypervisor. Apr 21 13:58:25 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:25 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:25 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:58:25 user nova-compute[71474]: INFO nova.compute.manager [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Took 10.41 seconds to build instance. 
Apr 21 13:58:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b5b2956a-889c-4938-8fee-9018b295eb78 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.595s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG nova.compute.manager [req-483d3b5c-66d1-4d67-8bbe-0bfe0ec70fb6 req-c9ba8c56-eac6-49d6-bb21-3eee67e4dcd8 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Received event network-vif-plugged-2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-483d3b5c-66d1-4d67-8bbe-0bfe0ec70fb6 req-c9ba8c56-eac6-49d6-bb21-3eee67e4dcd8 service nova] Acquiring lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-483d3b5c-66d1-4d67-8bbe-0bfe0ec70fb6 req-c9ba8c56-eac6-49d6-bb21-3eee67e4dcd8 service nova] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-483d3b5c-66d1-4d67-8bbe-0bfe0ec70fb6 req-c9ba8c56-eac6-49d6-bb21-3eee67e4dcd8 service nova] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG nova.compute.manager [req-483d3b5c-66d1-4d67-8bbe-0bfe0ec70fb6 req-c9ba8c56-eac6-49d6-bb21-3eee67e4dcd8 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] No waiting events found dispatching network-vif-plugged-2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) 
pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:26 user nova-compute[71474]: WARNING nova.compute.manager [req-483d3b5c-66d1-4d67-8bbe-0bfe0ec70fb6 req-c9ba8c56-eac6-49d6-bb21-3eee67e4dcd8 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Received unexpected event network-vif-plugged-2616f5a4-1b53-44bd-82ad-65419e2839ca for instance with vm_state building and task_state spawning. Apr 21 13:58:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG nova.compute.manager [req-13e6d233-7179-4e0b-882d-168d7cb0390d req-60bcda29-4bdd-48f0-a07c-48e18b379aa6 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Received event network-vif-plugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-13e6d233-7179-4e0b-882d-168d7cb0390d req-60bcda29-4bdd-48f0-a07c-48e18b379aa6 service nova] Acquiring lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-13e6d233-7179-4e0b-882d-168d7cb0390d req-60bcda29-4bdd-48f0-a07c-48e18b379aa6 service nova] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-13e6d233-7179-4e0b-882d-168d7cb0390d req-60bcda29-4bdd-48f0-a07c-48e18b379aa6 service nova] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:26 user nova-compute[71474]: DEBUG nova.compute.manager [req-13e6d233-7179-4e0b-882d-168d7cb0390d req-60bcda29-4bdd-48f0-a07c-48e18b379aa6 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] No waiting events found dispatching network-vif-plugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:26 user nova-compute[71474]: WARNING nova.compute.manager [req-13e6d233-7179-4e0b-882d-168d7cb0390d req-60bcda29-4bdd-48f0-a07c-48e18b379aa6 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Received unexpected event network-vif-plugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d for instance with vm_state active and task_state None. 
Apr 21 13:58:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:27 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] VM Resumed (Lifecycle Event) Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 13:58:27 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Instance spawned successfully. Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b 
tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:27 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 21 13:58:27 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:27 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] VM Started (Lifecycle Event) Apr 21 13:58:28 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:28 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:28 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:58:28 user nova-compute[71474]: INFO nova.compute.manager [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Took 9.75 seconds to spawn the instance on the hypervisor. Apr 21 13:58:28 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:28 user nova-compute[71474]: INFO nova.compute.manager [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Took 10.48 seconds to build instance. 
Apr 21 13:58:28 user nova-compute[71474]: DEBUG nova.compute.manager [req-1425732b-b125-42e3-8cdf-223d318841cb req-4089485a-cc91-474e-80ed-e30ace685c9e service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Received event network-vif-plugged-2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-1425732b-b125-42e3-8cdf-223d318841cb req-4089485a-cc91-474e-80ed-e30ace685c9e service nova] Acquiring lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-1425732b-b125-42e3-8cdf-223d318841cb req-4089485a-cc91-474e-80ed-e30ace685c9e service nova] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-1425732b-b125-42e3-8cdf-223d318841cb req-4089485a-cc91-474e-80ed-e30ace685c9e service nova] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:28 user nova-compute[71474]: DEBUG nova.compute.manager [req-1425732b-b125-42e3-8cdf-223d318841cb req-4089485a-cc91-474e-80ed-e30ace685c9e service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] No waiting events found dispatching network-vif-plugged-2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:28 user nova-compute[71474]: WARNING nova.compute.manager [req-1425732b-b125-42e3-8cdf-223d318841cb req-4089485a-cc91-474e-80ed-e30ace685c9e service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Received unexpected event network-vif-plugged-2616f5a4-1b53-44bd-82ad-65419e2839ca for instance with vm_state active and task_state None. 
Apr 21 13:58:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8a093d00-1bc2-45b6-a52a-203089bc997b tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.641s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:28 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Acquiring lock "0346fbd8-64cd-45e7-906f-e00eeece91ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:31 user nova-compute[71474]: DEBUG nova.compute.manager [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 13:58:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:31 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 13:58:31 user nova-compute[71474]: INFO nova.compute.claims [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Claim successful on node user Apr 21 13:58:31 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:58:31 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:58:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.671s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:31 user nova-compute[71474]: DEBUG nova.compute.manager [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG nova.compute.manager [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG nova.network.neutron [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 13:58:32 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Ignoring supplied device name: /dev/sda. 
Libvirt can't honour user-supplied dev names Apr 21 13:58:32 user nova-compute[71474]: DEBUG nova.compute.manager [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG nova.compute.manager [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 13:58:32 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Creating image(s) Apr 21 13:58:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Acquiring lock "/opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "/opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "/opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.003s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Acquiring lock "089bf89850d69f97c278c75135152fce198f26af" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:32 user nova-compute[71474]: 
DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "089bf89850d69f97c278c75135152fce198f26af" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af.part --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG nova.policy [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7ac1dc66f96249f884f9b4dbc73f37b4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1953479f081341b088eaecb8369fce0b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af.part --force-share --output=json" returned: 0 in 0.171s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG nova.virt.images [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] 2be63a0e-17cf-4166-a64b-96ec2f419df8 was qcow2, converting to raw {{(pid=71474) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG nova.privsep.utils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71474) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 13:58:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 
/opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af.part /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af.converted {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af.part /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af.converted" returned: 0 in 0.250s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af.converted --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af.converted --force-share --output=json" returned: 0 in 0.188s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "089bf89850d69f97c278c75135152fce198f26af" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af --force-share --output=json" returned: 0 in 0.209s {{(pid=71474) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Acquiring lock "089bf89850d69f97c278c75135152fce198f26af" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "089bf89850d69f97c278c75135152fce198f26af" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af --force-share --output=json" returned: 0 in 0.140s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af,backing_fmt=raw /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af,backing_fmt=raw /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk 1073741824" returned: 0 in 0.061s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 
tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "089bf89850d69f97c278c75135152fce198f26af" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.211s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/089bf89850d69f97c278c75135152fce198f26af --force-share --output=json" returned: 0 in 0.201s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Checking if we can resize image /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk. 
size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 13:58:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "f0f32b68-6993-4843-bcc6-bd0e06377b27" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk --force-share --output=json" returned: 0 in 0.256s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Cannot resize image /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk to a smaller size. {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.objects.instance [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lazy-loading 'migration_context' on Instance uuid 0346fbd8-64cd-45e7-906f-e00eeece91ce {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Starting instance... 
{{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Ensure instance console log exists: /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 13:58:34 user nova-compute[71474]: INFO nova.compute.claims [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Claim successful on node user Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.network.neutron [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Successfully created port: 94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.437s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Allocating IP information in the background. 
{{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 13:58:34 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 13:58:34 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 13:58:35 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Creating image(s) Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "/opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "/opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "/opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG nova.policy [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b60caf53ee58417cb76a77c963a45ec2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '15f83d6d2c3049e9ba1ac7f04ad2ebb0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.149s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG 
oslo_concurrency.processutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.165s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk 1073741824" returned: 0 in 0.085s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.264s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG nova.network.neutron [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] 
[instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Successfully updated port: 94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Acquiring lock "refresh_cache-0346fbd8-64cd-45e7-906f-e00eeece91ce" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Acquired lock "refresh_cache-0346fbd8-64cd-45e7-906f-e00eeece91ce" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG nova.network.neutron [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.164s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk. 
size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json" returned: 0 in 0.189s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Cannot resize image /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk to a smaller size. {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG nova.objects.instance [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lazy-loading 'migration_context' on Instance uuid f0f32b68-6993-4843-bcc6-bd0e06377b27 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Ensure instance console log exists: /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils 
[None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:36 user nova-compute[71474]: DEBUG nova.compute.manager [req-74b455e0-7701-4154-ae95-0af338dbf562 req-e1875212-9944-4326-8e0c-78427f4f566b service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Received event network-changed-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:36 user nova-compute[71474]: DEBUG nova.compute.manager [req-74b455e0-7701-4154-ae95-0af338dbf562 req-e1875212-9944-4326-8e0c-78427f4f566b service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Refreshing instance network info cache due to event network-changed-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 13:58:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-74b455e0-7701-4154-ae95-0af338dbf562 req-e1875212-9944-4326-8e0c-78427f4f566b service nova] Acquiring lock "refresh_cache-0346fbd8-64cd-45e7-906f-e00eeece91ce" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:36 user nova-compute[71474]: DEBUG nova.network.neutron [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Successfully created port: 20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.network.neutron [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Updating instance_info_cache with network_info: [{"id": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "address": "fa:16:3e:53:67:9d", "network": {"id": "d373a2f1-e506-4b6b-ab93-37e77bb02c7a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-197870629-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1953479f081341b088eaecb8369fce0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94bfdf6c-66", "ovs_interfaceid": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Releasing lock "refresh_cache-0346fbd8-64cd-45e7-906f-e00eeece91ce" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.compute.manager [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Instance network_info: |[{"id": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "address": "fa:16:3e:53:67:9d", "network": {"id": "d373a2f1-e506-4b6b-ab93-37e77bb02c7a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-197870629-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1953479f081341b088eaecb8369fce0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94bfdf6c-66", "ovs_interfaceid": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-74b455e0-7701-4154-ae95-0af338dbf562 req-e1875212-9944-4326-8e0c-78427f4f566b service nova] Acquired lock "refresh_cache-0346fbd8-64cd-45e7-906f-e00eeece91ce" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.network.neutron [req-74b455e0-7701-4154-ae95-0af338dbf562 req-e1875212-9944-4326-8e0c-78427f4f566b service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Refreshing network info cache for port 94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Start _get_guest_xml network_info=[{"id": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "address": "fa:16:3e:53:67:9d", "network": {"id": "d373a2f1-e506-4b6b-ab93-37e77bb02c7a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-197870629-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1953479f081341b088eaecb8369fce0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94bfdf6c-66", "ovs_interfaceid": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:58:22Z,direct_url=,disk_format='qcow2',id=2be63a0e-17cf-4166-a64b-96ec2f419df8,min_disk=0,min_ram=0,name='',owner='3a7c9bf0c07648ddb22f2e6637b8be0e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:58:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'size': 0, 'device_name': '/dev/sda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'scsi', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2be63a0e-17cf-4166-a64b-96ec2f419df8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 13:58:37 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] This host 
appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:58:37 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:58:22Z,direct_url=,disk_format='qcow2',id=2be63a0e-17cf-4166-a64b-96ec2f419df8,min_disk=0,min_ram=0,name='',owner='3a7c9bf0c07648ddb22f2e6637b8be0e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:58:24Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-21T13:58:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-704385804',display_name='tempest-AttachSCSIVolumeTestJSON-server-704385804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-704385804',id=5,image_ref='2be63a0e-17cf-4166-a64b-96ec2f419df8',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH1e308suG6Up9XacMAZI+/gRfmiDMDOuigLvKXnubV1WGNa08tjA4vufwk45vBBKQZeBxNXGuSgt+lX8T8X9YT0Q/e8fHr4reV0+dJepejAd/TWi7ALI+M/7BbZu2uxsg==',key_name='tempest-keypair-198905762',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1953479f081341b088eaecb8369fce0b',ramdisk_id='',reservation_id='r-26d32h24',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2be63a0e-17cf-4166-a64b-96ec2f419df8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1130428952',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1130428952-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='7ac1dc66f96249f884f9b4dbc73f37b4',uuid=0346fbd8-64cd-45e7-906f-e00eeece91ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "address": "fa:16:3e:53:67:9d", "network": {"id": "d373a2f1-e506-4b6b-ab93-37e77bb02c7a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-197870629-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1953479f081341b088eaecb8369fce0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94bfdf6c-66", "ovs_interfaceid": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Converting VIF {"id": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "address": "fa:16:3e:53:67:9d", "network": {"id": "d373a2f1-e506-4b6b-ab93-37e77bb02c7a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-197870629-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1953479f081341b088eaecb8369fce0b", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94bfdf6c-66", "ovs_interfaceid": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:67:9d,bridge_name='br-int',has_traffic_filtering=True,id=94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8,network=Network(d373a2f1-e506-4b6b-ab93-37e77bb02c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94bfdf6c-66') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.objects.instance [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lazy-loading 'pci_devices' on Instance uuid 0346fbd8-64cd-45e7-906f-e00eeece91ce {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] End _get_guest_xml xml= Apr 21 13:58:37 user nova-compute[71474]: 0346fbd8-64cd-45e7-906f-e00eeece91ce Apr 21 13:58:37 user nova-compute[71474]: instance-00000005 Apr 21 13:58:37 user nova-compute[71474]: 131072 Apr 21 13:58:37 user nova-compute[71474]: 1 Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: tempest-AttachSCSIVolumeTestJSON-server-704385804 Apr 21 13:58:37 user nova-compute[71474]: 2023-04-21 13:58:37 Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: 128 Apr 21 13:58:37 user nova-compute[71474]: 1 Apr 21 13:58:37 user nova-compute[71474]: 0 Apr 21 13:58:37 user nova-compute[71474]: 0 Apr 21 13:58:37 user nova-compute[71474]: 1 Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: tempest-AttachSCSIVolumeTestJSON-1130428952-project-member Apr 21 13:58:37 user nova-compute[71474]: tempest-AttachSCSIVolumeTestJSON-1130428952 Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: OpenStack Foundation Apr 21 13:58:37 user nova-compute[71474]: OpenStack Nova Apr 21 13:58:37 user nova-compute[71474]: 0.0.0 Apr 21 13:58:37 user nova-compute[71474]: 0346fbd8-64cd-45e7-906f-e00eeece91ce Apr 21 13:58:37 user nova-compute[71474]: 
0346fbd8-64cd-45e7-906f-e00eeece91ce Apr 21 13:58:37 user nova-compute[71474]: Virtual Machine Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: hvm Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Nehalem Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]:
Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]:
Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: /dev/urandom Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: Apr 21 13:58:37 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-21T13:58:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-704385804',display_name='tempest-AttachSCSIVolumeTestJSON-server-704385804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-704385804',id=5,image_ref='2be63a0e-17cf-4166-a64b-96ec2f419df8',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH1e308suG6Up9XacMAZI+/gRfmiDMDOuigLvKXnubV1WGNa08tjA4vufwk45vBBKQZeBxNXGuSgt+lX8T8X9YT0Q/e8fHr4reV0+dJepejAd/TWi7ALI+M/7BbZu2uxsg==',key_name='tempest-keypair-198905762',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1953479f081341b088eaecb8369fce0b',ramdisk_id='',reservation_id='r-26d32h24',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2be63a0e-17cf-4166-a64b-96ec2f419df8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1130428952',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1130428952-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='7ac1dc66f96249f884f9b4dbc73f37b4',uuid=0346fbd8-64cd-45e7-906f-e00eeece91ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "address": "fa:16:3e:53:67:9d", "network": {"id": "d373a2f1-e506-4b6b-ab93-37e77bb02c7a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-197870629-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1953479f081341b088eaecb8369fce0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94bfdf6c-66", "ovs_interfaceid": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Converting VIF {"id": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "address": "fa:16:3e:53:67:9d", "network": {"id": "d373a2f1-e506-4b6b-ab93-37e77bb02c7a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-197870629-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1953479f081341b088eaecb8369fce0b", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94bfdf6c-66", "ovs_interfaceid": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:67:9d,bridge_name='br-int',has_traffic_filtering=True,id=94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8,network=Network(d373a2f1-e506-4b6b-ab93-37e77bb02c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94bfdf6c-66') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG os_vif [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:67:9d,bridge_name='br-int',has_traffic_filtering=True,id=94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8,network=Network(d373a2f1-e506-4b6b-ab93-37e77bb02c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94bfdf6c-66') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94bfdf6c-66, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap94bfdf6c-66, col_values=(('external_ids', {'iface-id': '94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:67:9d', 'vm-uuid': '0346fbd8-64cd-45e7-906f-e00eeece91ce'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:37 user nova-compute[71474]: INFO os_vif [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:67:9d,bridge_name='br-int',has_traffic_filtering=True,id=94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8,network=Network(d373a2f1-e506-4b6b-ab93-37e77bb02c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94bfdf6c-66') Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] No BDM found with device name sda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] No BDM found with device name sdb, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 13:58:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] No VIF found with MAC fa:16:3e:53:67:9d, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 13:58:37 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Using config drive Apr 21 13:58:37 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Creating config drive at /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk.config Apr 21 13:58:37 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmp1q8b3kw1 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:38 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] CMD 
"genisoimage -o /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmp1q8b3kw1" returned: 0 in 0.061s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:38 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Successfully updated port: 20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 13:58:38 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:38 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquired lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:38 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 13:58:38 user nova-compute[71474]: DEBUG nova.compute.manager [req-9fa5d826-754e-4689-b155-f17041dcaf93 req-5272e368-89d6-4345-9812-2d41c4618f9e service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-changed-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:38 user nova-compute[71474]: DEBUG nova.compute.manager [req-9fa5d826-754e-4689-b155-f17041dcaf93 req-5272e368-89d6-4345-9812-2d41c4618f9e service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Refreshing instance network info cache due to event network-changed-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 13:58:38 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9fa5d826-754e-4689-b155-f17041dcaf93 req-5272e368-89d6-4345-9812-2d41c4618f9e service nova] Acquiring lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:38 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.network.neutron [req-74b455e0-7701-4154-ae95-0af338dbf562 req-e1875212-9944-4326-8e0c-78427f4f566b service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Updated VIF entry in instance network info cache for port 94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8. {{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.network.neutron [req-74b455e0-7701-4154-ae95-0af338dbf562 req-e1875212-9944-4326-8e0c-78427f4f566b service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Updating instance_info_cache with network_info: [{"id": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "address": "fa:16:3e:53:67:9d", "network": {"id": "d373a2f1-e506-4b6b-ab93-37e77bb02c7a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-197870629-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1953479f081341b088eaecb8369fce0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94bfdf6c-66", "ovs_interfaceid": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-74b455e0-7701-4154-ae95-0af338dbf562 req-e1875212-9944-4326-8e0c-78427f4f566b service nova] Releasing lock "refresh_cache-0346fbd8-64cd-45e7-906f-e00eeece91ce" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Updating instance_info_cache with network_info: [{"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Releasing lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Instance network_info: |[{"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9fa5d826-754e-4689-b155-f17041dcaf93 req-5272e368-89d6-4345-9812-2d41c4618f9e service nova] Acquired lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.network.neutron [req-9fa5d826-754e-4689-b155-f17041dcaf93 req-5272e368-89d6-4345-9812-2d41c4618f9e service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Refreshing network info cache for port 20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Start _get_guest_xml network_info=[{"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", 
"bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 13:58:39 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:58:39 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1193021950',display_name='tempest-VolumesAdminNegativeTest-server-1193021950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1193021950',id=6,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOhmqZ33jzOJUNp5cIjTjD2V4mwqGnUNtzXSj78uvtldCN9y9LKEaKBdycKDs4VYN2v9RCyHrUj9yHjgYAuNS07yjzech5h1dSQg5dt5ELnEas6naL+mLGQFJzls0JQplQ==',key_name='tempest-keypair-1507777159',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='15f83d6d2c3049e9ba1ac7f04ad2ebb0',ramdisk_id='',reservation_id='r-a6d80ryc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1182596808',owner_user_name='tempest-VolumesAdminNegativeTest-1182596808-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b60caf53ee58417cb76a77c963a45ec2',uuid=f0f32b68-6993-4843-bcc6-bd0e06377b27,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converting VIF {"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:78:94,bridge_name='br-int',has_traffic_filtering=True,id=20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20ca5a57-3c') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.objects.instance [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lazy-loading 'pci_devices' on Instance uuid f0f32b68-6993-4843-bcc6-bd0e06377b27 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] End _get_guest_xml xml= Apr 21 13:58:39 user nova-compute[71474]: f0f32b68-6993-4843-bcc6-bd0e06377b27 Apr 21 13:58:39 user nova-compute[71474]: instance-00000006 Apr 21 13:58:39 user nova-compute[71474]: 131072 Apr 21 13:58:39 user nova-compute[71474]: 1 Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: tempest-VolumesAdminNegativeTest-server-1193021950 Apr 21 13:58:39 user nova-compute[71474]: 2023-04-21 13:58:39 Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: 128 Apr 21 13:58:39 user nova-compute[71474]: 1 Apr 21 13:58:39 user nova-compute[71474]: 0 Apr 21 13:58:39 user nova-compute[71474]: 0 Apr 21 13:58:39 user nova-compute[71474]: 1 Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: tempest-VolumesAdminNegativeTest-1182596808-project-member Apr 21 13:58:39 user nova-compute[71474]: tempest-VolumesAdminNegativeTest-1182596808 Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: OpenStack Foundation Apr 21 13:58:39 user nova-compute[71474]: OpenStack Nova Apr 21 13:58:39 user nova-compute[71474]: 0.0.0 Apr 21 13:58:39 
user nova-compute[71474]: f0f32b68-6993-4843-bcc6-bd0e06377b27 Apr 21 13:58:39 user nova-compute[71474]: f0f32b68-6993-4843-bcc6-bd0e06377b27 Apr 21 13:58:39 user nova-compute[71474]: Virtual Machine Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: hvm Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Nehalem Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: /dev/urandom Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: Apr 21 13:58:39 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1193021950',display_name='tempest-VolumesAdminNegativeTest-server-1193021950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1193021950',id=6,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOhmqZ33jzOJUNp5cIjTjD2V4mwqGnUNtzXSj78uvtldCN9y9LKEaKBdycKDs4VYN2v9RCyHrUj9yHjgYAuNS07yjzech5h1dSQg5dt5ELnEas6naL+mLGQFJzls0JQplQ==',key_name='tempest-keypair-1507777159',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='15f83d6d2c3049e9ba1ac7f04ad2ebb0',ramdisk_id='',reservation_id='r-a6d80ryc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1182596808',owner_user_name='tempest-VolumesAdminNegativeTest-1182596808-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b60caf53ee58417cb76a77c963a45ec2',uuid=f0f32b68-6993-4843-bcc6-bd0e06377b27,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converting VIF {"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:78:94,bridge_name='br-int',has_traffic_filtering=True,id=20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20ca5a57-3c') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG os_vif [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:78:94,bridge_name='br-int',has_traffic_filtering=True,id=20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20ca5a57-3c') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20ca5a57-3c, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap20ca5a57-3c, col_values=(('external_ids', {'iface-id': '20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:78:94', 'vm-uuid': 'f0f32b68-6993-4843-bcc6-bd0e06377b27'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:58:39 user nova-compute[71474]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:39 user nova-compute[71474]: INFO os_vif [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:78:94,bridge_name='br-int',has_traffic_filtering=True,id=20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20ca5a57-3c') Apr 21 13:58:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 13:58:39 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] No VIF found with MAC fa:16:3e:02:78:94, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG nova.network.neutron [req-9fa5d826-754e-4689-b155-f17041dcaf93 req-5272e368-89d6-4345-9812-2d41c4618f9e service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Updated VIF entry in instance network info cache for port 20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG nova.network.neutron [req-9fa5d826-754e-4689-b155-f17041dcaf93 req-5272e368-89d6-4345-9812-2d41c4618f9e service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Updating instance_info_cache with network_info: [{"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9fa5d826-754e-4689-b155-f17041dcaf93 req-5272e368-89d6-4345-9812-2d41c4618f9e service nova] Releasing lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG nova.compute.manager [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Received event network-vif-plugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] Acquiring lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG nova.compute.manager [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] No waiting events found dispatching network-vif-plugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:40 user nova-compute[71474]: WARNING nova.compute.manager [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Received unexpected event network-vif-plugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 for instance with vm_state building and task_state spawning. Apr 21 13:58:40 user nova-compute[71474]: DEBUG nova.compute.manager [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Received event network-vif-plugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] Acquiring lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:40 user nova-compute[71474]: DEBUG nova.compute.manager [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] No waiting events found dispatching network-vif-plugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:40 user nova-compute[71474]: WARNING nova.compute.manager [req-7e1cfa8b-ea77-47d9-9b52-2847e9c7f57b req-947fed86-e757-4639-87f8-eb90acfcb709 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Received unexpected event network-vif-plugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 for instance with vm_state building and task_state spawning. 
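The AddBridgeCommand, AddPortCommand and DbSetCommand transactions logged at 13:58:39 above are os-vif's OVS plugin programming the local Open vSwitch database through ovsdbapp: ensure br-int exists, attach the tap device, and stamp the Interface row's external_ids with the Neutron port id, MAC and instance UUID so the OVN controller can bind the port. Below is a minimal standalone sketch of the same transaction pattern; it is not Nova or os-vif source, and the ovsdb-server endpoint, timeout and variable names are assumptions for illustration.

# Minimal sketch (not Nova/os-vif code) of the ovsdbapp transaction pattern
# seen in the log records above; endpoint and timeout are assumptions.
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

OVSDB = 'tcp:127.0.0.1:6640'  # assumed local ovsdb-server endpoint

conn = connection.Connection(
    idl=connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch'), timeout=5)
ovs = impl_idl.OvsdbIdl(conn)

# One transaction: ensure br-int exists, add the tap device, and tag the
# Interface row with the Neutron port id and instance metadata
# (values copied from the log records above).
with ovs.transaction(check_error=True) as txn:
    txn.add(ovs.add_br('br-int', may_exist=True, datapath_type='system'))
    txn.add(ovs.add_port('br-int', 'tap20ca5a57-3c', may_exist=True))
    txn.add(ovs.db_set(
        'Interface', 'tap20ca5a57-3c',
        ('external_ids', {
            'iface-id': '20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b',
            'iface-status': 'active',
            'attached-mac': 'fa:16:3e:02:78:94',
            'vm-uuid': 'f0f32b68-6993-4843-bcc6-bd0e06377b27'})))

The first transaction in the log reports "Transaction caused no change" because br-int already exists; may_exist=True makes the bridge and port commands idempotent.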
Apr 21 13:58:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:42 user nova-compute[71474]: DEBUG nova.compute.manager [req-980e8cb4-947e-49bc-9159-5d7301c4dd7b req-e2b01c8c-dbf0-4065-920c-76d03bd11fce service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:42 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-980e8cb4-947e-49bc-9159-5d7301c4dd7b req-e2b01c8c-dbf0-4065-920c-76d03bd11fce service nova] Acquiring lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:42 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-980e8cb4-947e-49bc-9159-5d7301c4dd7b req-e2b01c8c-dbf0-4065-920c-76d03bd11fce service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:42 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-980e8cb4-947e-49bc-9159-5d7301c4dd7b req-e2b01c8c-dbf0-4065-920c-76d03bd11fce service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:42 user nova-compute[71474]: DEBUG nova.compute.manager [req-980e8cb4-947e-49bc-9159-5d7301c4dd7b req-e2b01c8c-dbf0-4065-920c-76d03bd11fce service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] No waiting events found dispatching network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:42 user nova-compute[71474]: WARNING nova.compute.manager [req-980e8cb4-947e-49bc-9159-5d7301c4dd7b req-e2b01c8c-dbf0-4065-920c-76d03bd11fce service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received unexpected event 
network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b for instance with vm_state building and task_state spawning. Apr 21 13:58:42 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:42 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] VM Resumed (Lifecycle Event) Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Instance spawned successfully. 
Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] VM Started (Lifecycle Event) Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Took 10.91 seconds to spawn the instance on the hypervisor. Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Took 12.05 seconds to build instance. 
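In the "Synchronizing instance power state" records above, "current DB power_state: 0" and "VM power_state: 1" are Nova's numeric power-state constants: the database still shows the not-yet-started state while libvirt already reports the guest running. The short reference sketch below mirrors nova.compute.power_state; the names and values are assumptions taken from the Nova source tree rather than from this log, so verify them against the release in use.

# Illustrative mapping mirroring nova.compute.power_state; values here are
# assumptions, not taken from this log.
NOSTATE = 0x00    # "current DB power_state: 0" while the instance is building
RUNNING = 0x01    # "VM power_state: 1" once libvirt reports the guest running
PAUSED = 0x03
SHUTDOWN = 0x04
CRASHED = 0x06
SUSPENDED = 0x07

STATE_MAP = {
    NOSTATE: 'pending',
    RUNNING: 'running',
    PAUSED: 'paused',
    SHUTDOWN: 'shutdown',
    CRASHED: 'crashed',
    SUSPENDED: 'suspended',
}

Because task_state is still 'spawning' when the lifecycle events arrive, the sync is deferred, which is why the log prints "During sync_power_state the instance has a pending task (spawning). Skip." rather than updating the record.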
Apr 21 13:58:43 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-17eb390f-4076-448e-ae84-76f8d6c6c9fb tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.188s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] VM Resumed (Lifecycle Event) Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Instance spawned successfully. 
Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 
f0f32b68-6993-4843-bcc6-bd0e06377b27] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] VM Started (Lifecycle Event) Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Took 8.54 seconds to spawn the instance on the hypervisor. Apr 21 13:58:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:58:43 user nova-compute[71474]: INFO nova.compute.manager [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Took 9.46 seconds to build instance. 
Apr 21 13:58:43 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f07cda0b-1001-48a0-a9d3-d31e01d07db7 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.629s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:44 user nova-compute[71474]: DEBUG nova.compute.manager [req-3ab59224-4381-43b7-bc14-257e81ef3db4 req-6f334cc9-a1ba-43e3-9d4f-a53d0ef620a8 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:58:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-3ab59224-4381-43b7-bc14-257e81ef3db4 req-6f334cc9-a1ba-43e3-9d4f-a53d0ef620a8 service nova] Acquiring lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-3ab59224-4381-43b7-bc14-257e81ef3db4 req-6f334cc9-a1ba-43e3-9d4f-a53d0ef620a8 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-3ab59224-4381-43b7-bc14-257e81ef3db4 req-6f334cc9-a1ba-43e3-9d4f-a53d0ef620a8 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:44 user nova-compute[71474]: DEBUG nova.compute.manager [req-3ab59224-4381-43b7-bc14-257e81ef3db4 req-6f334cc9-a1ba-43e3-9d4f-a53d0ef620a8 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] No waiting events found dispatching network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:58:44 user nova-compute[71474]: WARNING nova.compute.manager [req-3ab59224-4381-43b7-bc14-257e81ef3db4 req-6f334cc9-a1ba-43e3-9d4f-a53d0ef620a8 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received unexpected event network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b for instance with vm_state active and task_state None. 
Apr 21 13:58:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:46 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:47 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:47 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Cleaning up deleted instances {{(pid=71474) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 21 13:58:47 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] There are 0 instances to clean {{(pid=71474) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 21 13:58:47 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:47 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Cleaning up deleted instances with incomplete migration {{(pid=71474) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 21 13:58:47 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:50 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:51 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 13:58:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Rebuilding the list of instances to heal {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 13:58:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:58:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:58:51 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 13:58:51 user nova-compute[71474]: DEBUG nova.objects.instance [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lazy-loading 'info_cache' on Instance uuid 5030decd-cbe5-4495-b497-dfacf25eef73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Updating instance_info_cache with network_info: [{"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk --force-share --output=json" returned: 0 in 0.169s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json" returned: 0 in 0.177s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 
-m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk --force-share --output=json" returned: 0 in 0.170s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:55 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk --force-share --output=json" returned: 0 in 0.210s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.177s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:55 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "5e502c4c-a46b-4670-acba-2fda2d05adf5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:56 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:58:56 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
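Editor's note on the audit above: the update_available_resource periodic task probes every instance disk with qemu-img info, wrapped in oslo_concurrency.prlimit to cap address space (1 GiB) and CPU time (30 s); the command line below is copied verbatim from the log entries above. A sketch, assuming a hypothetical helper name qemu_img_info and that the wrapper and limits match your deployment, that reproduces one probe and reads the JSON it returns:

    # Sketch of the per-disk probe run by the resource tracker above.
    # The command is taken verbatim from the log; adjust paths/limits as needed.
    import json
    import subprocess

    def qemu_img_info(path: str) -> dict:
        cmd = [
            "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
            "--as=1073741824", "--cpu=30", "--",
            "env", "LC_ALL=C", "LANG=C",
            "qemu-img", "info", path, "--force-share", "--output=json",
        ]
        out = subprocess.run(cmd, check=True, capture_output=True, text=True).stdout
        return json.loads(out)

    info = qemu_img_info(
        "/opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk"
    )
    # qemu-img's JSON output typically includes "format", "virtual-size" and
    # "actual-size" (bytes), which feed the disk figures in the resource view.
    print(info.get("format"), info.get("virtual-size"))

The --force-share flag matters here: it lets the audit read image metadata while the guest still holds the file open, which is why these probes can run safely against disks of running instances.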
Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8137MB free_disk=26.195842742919922GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.manager [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5030decd-cbe5-4495-b497-dfacf25eef73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 30068c4a-94ed-4b84-9178-0d554326fc68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 3af27bc9-9617-44c7-bfa4-993b347d183c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 2c5afe45-87ae-477a-8bf0-6a5e2036fb68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 0346fbd8-64cd-45e7-906f-e00eeece91ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance f0f32b68-6993-4843-bcc6-bd0e06377b27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5e502c4c-a46b-4670-acba-2fda2d05adf5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 6 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=1280MB phys_disk=40GB used_disk=6GB total_vcpus=12 used_vcpus=6 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.453s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.336s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 13:58:56 user nova-compute[71474]: INFO nova.compute.claims [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Claim successful on node user Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Refreshing inventories for resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Updating ProviderTree inventory for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Updating inventory in ProviderTree for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Refreshing aggregate associations for resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7, aggregates: None {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 21 13:58:56 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Refreshing trait associations for resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7, traits: 
COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.754s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG nova.network.neutron [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 13:58:57 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 13:58:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Start spawning the instance on the hypervisor. 
{{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 13:58:57 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Creating image(s) Apr 21 13:58:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "/opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "/opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "/opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.133s {{(pid=71474) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:57 user nova-compute[71474]: DEBUG nova.policy [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '132913991f8c45c1adaf5db7ef7cea30', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '885cdc1521a14985bfa70ae21e73c693', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk 1073741824 {{(pid=71474) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk 1073741824" returned: 0 in 0.047s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.185s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.146s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Checking if we can resize image /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk. 
size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Cannot resize image /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk to a smaller size. {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG nova.objects.instance [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lazy-loading 'migration_context' on Instance uuid 5e502c4c-a46b-4670-acba-2fda2d05adf5 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Ensure instance console log exists: /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 
tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:58:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:58:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:58:59 user nova-compute[71474]: DEBUG nova.network.neutron [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Successfully created port: 9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.network.neutron [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Successfully updated port: 9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquired lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.network.neutron [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.compute.manager [req-567602ec-fc22-4868-8d78-2aa2edca2bc6 req-d4761816-a121-4666-9059-511e1db521e7 service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Received event network-changed-9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.compute.manager [req-567602ec-fc22-4868-8d78-2aa2edca2bc6 req-d4761816-a121-4666-9059-511e1db521e7 service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Refreshing instance network info cache due to event network-changed-9ba354a7-6fb2-4eb1-96f4-edb58950895e. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-567602ec-fc22-4868-8d78-2aa2edca2bc6 req-d4761816-a121-4666-9059-511e1db521e7 service nova] Acquiring lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.network.neutron [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Instance cache missing network info. {{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.network.neutron [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Updating instance_info_cache with network_info: [{"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": "fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Releasing lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.compute.manager [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Instance network_info: |[{"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": "fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", 
"label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-567602ec-fc22-4868-8d78-2aa2edca2bc6 req-d4761816-a121-4666-9059-511e1db521e7 service nova] Acquired lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.network.neutron [req-567602ec-fc22-4868-8d78-2aa2edca2bc6 req-d4761816-a121-4666-9059-511e1db521e7 service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Refreshing network info cache for port 9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Start _get_guest_xml network_info=[{"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": "fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:00 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:59:00 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Topology 
preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1731146767',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1731146767',id=7,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='885cdc1521a14985bfa70ae21e73c693',ramdisk_id='',reservation_id='r-swvv899m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-28514522',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:58Z,user_data=None,user_id='132913991f8c45c1adaf5db7ef7cea30',uuid=5e502c4c-a46b-4670-acba-2fda2d05adf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": "fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converting VIF {"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": 
"fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5f:60,bridge_name='br-int',has_traffic_filtering=True,id=9ba354a7-6fb2-4eb1-96f4-edb58950895e,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ba354a7-6f') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.objects.instance [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lazy-loading 'pci_devices' on Instance uuid 5e502c4c-a46b-4670-acba-2fda2d05adf5 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] End _get_guest_xml xml= Apr 21 13:59:00 user nova-compute[71474]: 5e502c4c-a46b-4670-acba-2fda2d05adf5 Apr 21 13:59:00 user nova-compute[71474]: instance-00000007 Apr 21 13:59:00 user nova-compute[71474]: 131072 Apr 21 13:59:00 user nova-compute[71474]: 1 Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: tempest-ServerBootFromVolumeStableRescueTest-server-1731146767 Apr 21 13:59:00 user nova-compute[71474]: 2023-04-21 13:59:00 Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: 128 Apr 21 13:59:00 user nova-compute[71474]: 1 Apr 21 13:59:00 user nova-compute[71474]: 0 Apr 21 13:59:00 user nova-compute[71474]: 0 Apr 21 13:59:00 user nova-compute[71474]: 1 Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member Apr 21 13:59:00 user nova-compute[71474]: tempest-ServerBootFromVolumeStableRescueTest-28514522 Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user 
nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: OpenStack Foundation Apr 21 13:59:00 user nova-compute[71474]: OpenStack Nova Apr 21 13:59:00 user nova-compute[71474]: 0.0.0 Apr 21 13:59:00 user nova-compute[71474]: 5e502c4c-a46b-4670-acba-2fda2d05adf5 Apr 21 13:59:00 user nova-compute[71474]: 5e502c4c-a46b-4670-acba-2fda2d05adf5 Apr 21 13:59:00 user nova-compute[71474]: Virtual Machine Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: hvm Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Nehalem Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: /dev/urandom Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: Apr 21 13:59:00 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1731146767',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1731146767',id=7,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='885cdc1521a14985bfa70ae21e73c693',ramdisk_id='',reservation_id='r-swvv899m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-28514522',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:58:58Z,user_data=None,user_id='132913991f8c45c1adaf5db7ef7cea30',uuid=5e502c4c-a46b-4670-acba-2fda2d05adf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": "fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converting VIF {"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": 
"fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5f:60,bridge_name='br-int',has_traffic_filtering=True,id=9ba354a7-6fb2-4eb1-96f4-edb58950895e,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ba354a7-6f') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG os_vif [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5f:60,bridge_name='br-int',has_traffic_filtering=True,id=9ba354a7-6fb2-4eb1-96f4-edb58950895e,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ba354a7-6f') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Acquiring lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ba354a7-6f, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9ba354a7-6f, col_values=(('external_ids', {'iface-id': '9ba354a7-6fb2-4eb1-96f4-edb58950895e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2a:5f:60', 'vm-uuid': '5e502c4c-a46b-4670-acba-2fda2d05adf5'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.compute.manager [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:00 user nova-compute[71474]: INFO os_vif [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:5f:60,bridge_name='br-int',has_traffic_filtering=True,id=9ba354a7-6fb2-4eb1-96f4-edb58950895e,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ba354a7-6f') Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] No VIF found with MAC fa:16:3e:2a:5f:60, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 13:59:00 user nova-compute[71474]: INFO nova.compute.claims [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Claim successful on node user Apr 21 13:59:01 user nova-compute[71474]: DEBUG nova.network.neutron [req-567602ec-fc22-4868-8d78-2aa2edca2bc6 req-d4761816-a121-4666-9059-511e1db521e7 service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Updated VIF entry in instance network info cache for port 9ba354a7-6fb2-4eb1-96f4-edb58950895e. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG nova.network.neutron [req-567602ec-fc22-4868-8d78-2aa2edca2bc6 req-d4761816-a121-4666-9059-511e1db521e7 service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Updating instance_info_cache with network_info: [{"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": "fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-567602ec-fc22-4868-8d78-2aa2edca2bc6 req-d4761816-a121-4666-9059-511e1db521e7 service nova] Releasing lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.355s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 
tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 13:59:01 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 13:59:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG nova.policy [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f09a36fbec134e248aa2baec4bb6a53b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f4add6cbbd424513a25dacd5dfb3adcf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Start spawning the instance on the hypervisor. 
{{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 13:59:01 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Creating image(s) Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Acquiring lock "/opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "/opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "/opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 
tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.127s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk 1073741824" returned: 0 in 0.066s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.200s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Checking if we can resize image /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG nova.network.neutron [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Successfully created port: 806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Cannot resize image /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk to a smaller size. 
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG nova.objects.instance [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lazy-loading 'migration_context' on Instance uuid 90591d9b-6d6b-4f22-a3dc-fd83044df26b {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Ensure instance console log exists: /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG nova.compute.manager [req-6a68e961-30e1-45ff-9be3-b3e77b40c2d5 req-22bdd158-63fe-4ee6-aaf4-9efce6808d02 service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Received event network-vif-plugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils 
[req-6a68e961-30e1-45ff-9be3-b3e77b40c2d5 req-22bdd158-63fe-4ee6-aaf4-9efce6808d02 service nova] Acquiring lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-6a68e961-30e1-45ff-9be3-b3e77b40c2d5 req-22bdd158-63fe-4ee6-aaf4-9efce6808d02 service nova] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-6a68e961-30e1-45ff-9be3-b3e77b40c2d5 req-22bdd158-63fe-4ee6-aaf4-9efce6808d02 service nova] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG nova.compute.manager [req-6a68e961-30e1-45ff-9be3-b3e77b40c2d5 req-22bdd158-63fe-4ee6-aaf4-9efce6808d02 service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] No waiting events found dispatching network-vif-plugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:59:02 user nova-compute[71474]: WARNING nova.compute.manager [req-6a68e961-30e1-45ff-9be3-b3e77b40c2d5 req-22bdd158-63fe-4ee6-aaf4-9efce6808d02 service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Received unexpected event network-vif-plugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e for instance with vm_state building and task_state spawning. 
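The image-backend entries above show the two qemu-img operations nova performs for this boot: an info probe on the cached base image, run under oslo_concurrency.prlimit resource caps, and creation of the instance disk as a qcow2 overlay on that raw base. The sketch below is illustrative only (plain subprocess calls reusing the flags from the logged commands), not nova's imagebackend code:

    import json
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7"
    DISK = "/opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk"

    # Probe the cached base image under resource limits (1 GiB address space, 30 s CPU),
    # mirroring the prlimit-wrapped command logged by oslo_concurrency.processutils.
    probe = [
        "python3", "-m", "oslo_concurrency.prlimit", "--as=1073741824", "--cpu=30", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", BASE, "--force-share", "--output=json",
    ]
    info = json.loads(subprocess.run(probe, check=True, capture_output=True, text=True).stdout)
    print(info["format"], info["virtual-size"])

    # Create the 1 GiB instance disk as a qcow2 overlay backed by the raw base image,
    # as in the "qemu-img create -f qcow2 -o backing_file=..." entry above.
    subprocess.run(
        ["env", "LC_ALL=C", "LANG=C", "qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={BASE},backing_fmt=raw", DISK, "1073741824"],
        check=True,
    )

Because the overlay only records writes, creating it is nearly instant, which matches the 0.066s reported for the create command above.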
Apr 21 13:59:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.network.neutron [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Successfully updated port: 806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Acquiring lock "refresh_cache-90591d9b-6d6b-4f22-a3dc-fd83044df26b" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Acquired lock "refresh_cache-90591d9b-6d6b-4f22-a3dc-fd83044df26b" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.network.neutron [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.compute.manager [req-e0f6489f-531b-487d-8bec-ae54425a1ac1 req-8b1e8319-8def-4180-8bd3-a808e280cf1a service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Received event network-changed-806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.compute.manager [req-e0f6489f-531b-487d-8bec-ae54425a1ac1 req-8b1e8319-8def-4180-8bd3-a808e280cf1a service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Refreshing instance network info cache due to event network-changed-806c5236-f27a-4616-9ec6-85a2585f753a. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e0f6489f-531b-487d-8bec-ae54425a1ac1 req-8b1e8319-8def-4180-8bd3-a808e280cf1a service nova] Acquiring lock "refresh_cache-90591d9b-6d6b-4f22-a3dc-fd83044df26b" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.network.neutron [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.network.neutron [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Updating instance_info_cache with network_info: [{"id": "806c5236-f27a-4616-9ec6-85a2585f753a", "address": "fa:16:3e:0e:86:45", "network": {"id": "d4871732-c28b-456f-8efe-8f0c46a4107d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-86130370-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f4add6cbbd424513a25dacd5dfb3adcf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap806c5236-f2", "ovs_interfaceid": "806c5236-f27a-4616-9ec6-85a2585f753a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Releasing lock "refresh_cache-90591d9b-6d6b-4f22-a3dc-fd83044df26b" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.compute.manager [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Instance network_info: |[{"id": "806c5236-f27a-4616-9ec6-85a2585f753a", "address": "fa:16:3e:0e:86:45", "network": {"id": "d4871732-c28b-456f-8efe-8f0c46a4107d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-86130370-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f4add6cbbd424513a25dacd5dfb3adcf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap806c5236-f2", "ovs_interfaceid": "806c5236-f27a-4616-9ec6-85a2585f753a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e0f6489f-531b-487d-8bec-ae54425a1ac1 req-8b1e8319-8def-4180-8bd3-a808e280cf1a service nova] Acquired lock "refresh_cache-90591d9b-6d6b-4f22-a3dc-fd83044df26b" {{(pid=71474) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.network.neutron [req-e0f6489f-531b-487d-8bec-ae54425a1ac1 req-8b1e8319-8def-4180-8bd3-a808e280cf1a service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Refreshing network info cache for port 806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Start _get_guest_xml network_info=[{"id": "806c5236-f27a-4616-9ec6-85a2585f753a", "address": "fa:16:3e:0e:86:45", "network": {"id": "d4871732-c28b-456f-8efe-8f0c46a4107d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-86130370-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f4add6cbbd424513a25dacd5dfb3adcf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap806c5236-f2", "ovs_interfaceid": "806c5236-f27a-4616-9ec6-85a2585f753a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 13:59:03 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:59:03 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
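The network_info blob cached above is plain JSON; everything the rest of the boot needs (tap device name, MAC, fixed IP, MTU) can be read straight out of it. A trimmed, illustrative example using only the fields referenced later:

    import json

    # Trimmed copy of the single VIF entry cached for this instance above.
    network_info = json.loads('''
    [{"id": "806c5236-f27a-4616-9ec6-85a2585f753a",
      "address": "fa:16:3e:0e:86:45",
      "devname": "tap806c5236-f2",
      "network": {"meta": {"mtu": 1442},
                  "subnets": [{"cidr": "10.0.0.0/28",
                               "ips": [{"address": "10.0.0.13", "type": "fixed"}]}]}}]
    ''')

    vif = network_info[0]
    print(vif["devname"])                                      # tap806c5236-f2
    print(vif["address"])                                      # fa:16:3e:0e:86:45
    print(vif["network"]["subnets"][0]["ips"][0]["address"])   # 10.0.0.13
    print(vif["network"]["meta"]["mtu"])                       # 1442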
Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 13:59:03 
user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:59:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1763512323',display_name='tempest-DeleteServersTestJSON-server-1763512323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1763512323',id=8,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f4add6cbbd424513a25dacd5dfb3adcf',ramdisk_id='',reservation_id='r-hbsf4k4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-356048122',owner_user_name='tempest-DeleteServersTestJSON-356048122-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:59:01Z,user_dat
a=None,user_id='f09a36fbec134e248aa2baec4bb6a53b',uuid=90591d9b-6d6b-4f22-a3dc-fd83044df26b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "806c5236-f27a-4616-9ec6-85a2585f753a", "address": "fa:16:3e:0e:86:45", "network": {"id": "d4871732-c28b-456f-8efe-8f0c46a4107d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-86130370-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f4add6cbbd424513a25dacd5dfb3adcf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap806c5236-f2", "ovs_interfaceid": "806c5236-f27a-4616-9ec6-85a2585f753a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Converting VIF {"id": "806c5236-f27a-4616-9ec6-85a2585f753a", "address": "fa:16:3e:0e:86:45", "network": {"id": "d4871732-c28b-456f-8efe-8f0c46a4107d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-86130370-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f4add6cbbd424513a25dacd5dfb3adcf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap806c5236-f2", "ovs_interfaceid": "806c5236-f27a-4616-9ec6-85a2585f753a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:86:45,bridge_name='br-int',has_traffic_filtering=True,id=806c5236-f27a-4616-9ec6-85a2585f753a,network=Network(d4871732-c28b-456f-8efe-8f0c46a4107d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap806c5236-f2') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.objects.instance [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lazy-loading 'pci_devices' on Instance uuid 90591d9b-6d6b-4f22-a3dc-fd83044df26b {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:59:03 
user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] End _get_guest_xml xml= [guest domain XML elided by the log capture; recoverable values: domain name instance-00000008, uuid 90591d9b-6d6b-4f22-a3dc-fd83044df26b, memory 131072 KiB, 1 vCPU, nova metadata (server tempest-DeleteServersTestJSON-server-1763512323, created 2023-04-21 13:59:03, flavor 128 MiB / 1 vCPU / 0 swap / 0 ephemeral, owner tempest-DeleteServersTestJSON-356048122-project-member / tempest-DeleteServersTestJSON-356048122), sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:59:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1763512323',display_name='tempest-DeleteServersTestJSON-server-1763512323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1763512323',id=8,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f4add6cbbd424513a25dacd5dfb3adcf',ramdisk_id='',reservation_id='r-hbsf4k4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-356048122',owner_user_name='tempest-DeleteServersTestJSON-356048122-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:59:01Z,user_data=None,user_id='f09a36fbec134e248aa2baec4bb6a53b',uuid=90591d9b-6d6b-4f22-a3dc-fd83044df26b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "806c5236-f27a-4616-9ec6-85a2585f753a", "address": "fa:16:3e:0e:86:45", "network": {"id": "d4871732-c28b-456f-8efe-8f0c46a4107d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-86130370-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f4add6cbbd424513a25dacd5dfb3adcf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap806c5236-f2", "ovs_interfaceid": "806c5236-f27a-4616-9ec6-85a2585f753a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Converting VIF {"id": "806c5236-f27a-4616-9ec6-85a2585f753a", "address": "fa:16:3e:0e:86:45", "network": {"id": "d4871732-c28b-456f-8efe-8f0c46a4107d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-86130370-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f4add6cbbd424513a25dacd5dfb3adcf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap806c5236-f2", "ovs_interfaceid": "806c5236-f27a-4616-9ec6-85a2585f753a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:86:45,bridge_name='br-int',has_traffic_filtering=True,id=806c5236-f27a-4616-9ec6-85a2585f753a,network=Network(d4871732-c28b-456f-8efe-8f0c46a4107d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap806c5236-f2') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG os_vif [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:86:45,bridge_name='br-int',has_traffic_filtering=True,id=806c5236-f27a-4616-9ec6-85a2585f753a,network=Network(d4871732-c28b-456f-8efe-8f0c46a4107d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap806c5236-f2') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap806c5236-f2, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap806c5236-f2, col_values=(('external_ids', {'iface-id': '806c5236-f27a-4616-9ec6-85a2585f753a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:86:45', 'vm-uuid': '90591d9b-6d6b-4f22-a3dc-fd83044df26b'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:03 user nova-compute[71474]: INFO os_vif [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:86:45,bridge_name='br-int',has_traffic_filtering=True,id=806c5236-f27a-4616-9ec6-85a2585f753a,network=Network(d4871732-c28b-456f-8efe-8f0c46a4107d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap806c5236-f2') Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 13:59:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] No VIF found with MAC fa:16:3e:0e:86:45, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.network.neutron [req-e0f6489f-531b-487d-8bec-ae54425a1ac1 req-8b1e8319-8def-4180-8bd3-a808e280cf1a service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Updated VIF entry in instance network info cache for port 806c5236-f27a-4616-9ec6-85a2585f753a. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.network.neutron [req-e0f6489f-531b-487d-8bec-ae54425a1ac1 req-8b1e8319-8def-4180-8bd3-a808e280cf1a service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Updating instance_info_cache with network_info: [{"id": "806c5236-f27a-4616-9ec6-85a2585f753a", "address": "fa:16:3e:0e:86:45", "network": {"id": "d4871732-c28b-456f-8efe-8f0c46a4107d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-86130370-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f4add6cbbd424513a25dacd5dfb3adcf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap806c5236-f2", "ovs_interfaceid": "806c5236-f27a-4616-9ec6-85a2585f753a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e0f6489f-531b-487d-8bec-ae54425a1ac1 req-8b1e8319-8def-4180-8bd3-a808e280cf1a service nova] Releasing lock "refresh_cache-90591d9b-6d6b-4f22-a3dc-fd83044df26b" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:59:04 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] VM Resumed (Lifecycle Event) Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:59:04 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Instance spawned successfully. 
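os-vif applies the bridge and port changes logged above through ovsdbapp transactions (the AddBridgeCommand, AddPortCommand and DbSetCommand entries). Purely for orientation, the same end state expressed as ovs-vsctl calls would look roughly like the sketch below; this is an assumed command-line equivalent, not what the plugin actually executes:

    import subprocess

    PORT = "tap806c5236-f2"

    # Ensure the integration bridge exists with the system datapath, as in AddBridgeCommand.
    subprocess.run(["ovs-vsctl", "--may-exist", "add-br", "br-int",
                    "--", "set", "Bridge", "br-int", "datapath_type=system"], check=True)

    # Add the tap device and label its Interface row so OVN can bind the Neutron port,
    # mirroring the external_ids written by DbSetCommand above.
    subprocess.run(["ovs-vsctl", "--may-exist", "add-port", "br-int", PORT,
                    "--", "set", "Interface", PORT,
                    "external_ids:iface-id=806c5236-f27a-4616-9ec6-85a2585f753a",
                    "external_ids:iface-status=active",
                    'external_ids:attached-mac="fa:16:3e:0e:86:45"',
                    "external_ids:vm-uuid=90591d9b-6d6b-4f22-a3dc-fd83044df26b"], check=True)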
Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:04 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:59:04 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] VM Started (Lifecycle Event) Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:59:04 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:59:04 user nova-compute[71474]: INFO nova.compute.manager [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Took 6.78 seconds to spawn the instance on the hypervisor. Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:59:04 user nova-compute[71474]: INFO nova.compute.manager [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Took 8.24 seconds to build instance. 
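The power-state synchronisation lines above compare numeric codes ("current DB power_state: 0, VM power_state: 1"). As a reading aid, here is a small mapping that mirrors the values used by nova.compute.power_state (reproduced here as an assumption rather than imported from nova):

    # Convenience mapping so the sync_power_state entries above read as names.
    POWER_STATE = {
        0: "NOSTATE",    # no state recorded yet; what the DB holds while building
        1: "RUNNING",    # what libvirt reports once the guest has started
        3: "PAUSED",
        4: "SHUTDOWN",
        6: "CRASHED",
        7: "SUSPENDED",
    }

    def describe(db_state: int, vm_state: int) -> str:
        return f"DB={POWER_STATE.get(db_state, db_state)} VM={POWER_STATE.get(vm_state, vm_state)}"

    print(describe(0, 1))   # DB=NOSTATE VM=RUNNING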
Apr 21 13:59:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-bca69c3f-1b50-450e-9cdf-a479ba0551d0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.353s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.compute.manager [req-40ebf43c-0d36-455a-8a45-71e2cd2ddeff req-e205e987-6418-4b7f-9188-d07fb5eb01bf service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Received event network-vif-plugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-40ebf43c-0d36-455a-8a45-71e2cd2ddeff req-e205e987-6418-4b7f-9188-d07fb5eb01bf service nova] Acquiring lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-40ebf43c-0d36-455a-8a45-71e2cd2ddeff req-e205e987-6418-4b7f-9188-d07fb5eb01bf service nova] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-40ebf43c-0d36-455a-8a45-71e2cd2ddeff req-e205e987-6418-4b7f-9188-d07fb5eb01bf service nova] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG nova.compute.manager [req-40ebf43c-0d36-455a-8a45-71e2cd2ddeff req-e205e987-6418-4b7f-9188-d07fb5eb01bf service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] No waiting events found dispatching network-vif-plugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:59:04 user nova-compute[71474]: WARNING nova.compute.manager [req-40ebf43c-0d36-455a-8a45-71e2cd2ddeff req-e205e987-6418-4b7f-9188-d07fb5eb01bf service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Received unexpected event network-vif-plugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e for instance with vm_state active and task_state None. 
Apr 21 13:59:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-957327f2-9111-40c6-8d73-54a25d279800 req-c4c3de66-0b2a-4a20-b403-6ea82d920d01 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Received event network-vif-plugged-806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-957327f2-9111-40c6-8d73-54a25d279800 req-c4c3de66-0b2a-4a20-b403-6ea82d920d01 service nova] Acquiring lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-957327f2-9111-40c6-8d73-54a25d279800 req-c4c3de66-0b2a-4a20-b403-6ea82d920d01 service nova] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-957327f2-9111-40c6-8d73-54a25d279800 req-c4c3de66-0b2a-4a20-b403-6ea82d920d01 service nova] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-957327f2-9111-40c6-8d73-54a25d279800 req-c4c3de66-0b2a-4a20-b403-6ea82d920d01 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] No waiting events found dispatching network-vif-plugged-806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:59:05 user nova-compute[71474]: WARNING nova.compute.manager [req-957327f2-9111-40c6-8d73-54a25d279800 req-c4c3de66-0b2a-4a20-b403-6ea82d920d01 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Received unexpected event network-vif-plugged-806c5236-f27a-4616-9ec6-85a2585f753a for instance with vm_state building and task_state spawning. 
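Each "Acquiring lock / acquired / released" trio in these entries (the per-instance "<uuid>-events" lock here, the disk.info and refresh_cache locks earlier) is emitted by oslo.concurrency's lock helpers. A minimal sketch of the two usual forms, assuming a process-local lock as in this log:

    from oslo_concurrency import lockutils

    EVENTS_LOCK = "5e502c4c-a46b-4670-acba-2fda2d05adf5-events"

    # Decorator form: the generated wrapper logs the acquire/release pair seen above.
    @lockutils.synchronized(EVENTS_LOCK)
    def pop_event():
        # mutate the per-instance event bookkeeping while holding the lock
        return None

    pop_event()

    # Context-manager form, equivalent for an ad-hoc critical section.
    with lockutils.lock(EVENTS_LOCK):
        pass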
Apr 21 13:59:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:59:07 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] VM Resumed (Lifecycle Event) Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 13:59:07 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Instance spawned successfully. 
Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.compute.manager [req-d6f64c58-d271-4231-8d85-13b0a3832416 req-14a5fbc1-635d-49e2-b9e2-ac36e73309c9 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Received event network-vif-plugged-806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-d6f64c58-d271-4231-8d85-13b0a3832416 req-14a5fbc1-635d-49e2-b9e2-ac36e73309c9 service nova] Acquiring lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-d6f64c58-d271-4231-8d85-13b0a3832416 req-14a5fbc1-635d-49e2-b9e2-ac36e73309c9 service nova] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-d6f64c58-d271-4231-8d85-13b0a3832416 req-14a5fbc1-635d-49e2-b9e2-ac36e73309c9 service nova] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.compute.manager [req-d6f64c58-d271-4231-8d85-13b0a3832416 req-14a5fbc1-635d-49e2-b9e2-ac36e73309c9 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] No waiting events found dispatching network-vif-plugged-806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:59:07 user nova-compute[71474]: WARNING nova.compute.manager [req-d6f64c58-d271-4231-8d85-13b0a3832416 req-14a5fbc1-635d-49e2-b9e2-ac36e73309c9 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Received unexpected event network-vif-plugged-806c5236-f27a-4616-9ec6-85a2585f753a for instance with vm_state building and task_state spawning. 
Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:07 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:59:07 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] VM Started (Lifecycle Event) Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:59:07 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:59:07 user nova-compute[71474]: INFO nova.compute.manager [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Took 5.91 seconds to spawn the instance on the hypervisor. Apr 21 13:59:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:59:07 user nova-compute[71474]: INFO nova.compute.manager [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Took 6.62 seconds to build instance. 
Apr 21 13:59:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4be3eed1-17a5-4293-95fa-755d3d308c32 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.722s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:08 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:09 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Acquiring lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:21 user nova-compute[71474]: DEBUG nova.compute.manager [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] 
[instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 13:59:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:21 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 13:59:21 user nova-compute[71474]: INFO nova.compute.claims [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Claim successful on node user Apr 21 13:59:21 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:59:21 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:59:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.368s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:21 user nova-compute[71474]: DEBUG nova.compute.manager [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 
tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 13:59:21 user nova-compute[71474]: DEBUG nova.compute.manager [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 13:59:21 user nova-compute[71474]: DEBUG nova.network.neutron [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 13:59:21 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 13:59:21 user nova-compute[71474]: DEBUG nova.compute.manager [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG nova.policy [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1a2438d69a684df69e1de2edddc73bc0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9daf036d4ad84586a628c454408e3d7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG nova.compute.manager [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Start spawning the instance on the hypervisor. 
{{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 13:59:22 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Creating image(s) Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Acquiring lock "/opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "/opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "/opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils 
[None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.130s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk 1073741824" returned: 0 in 0.047s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.185s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.139s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Checking if we can resize image /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Cannot resize image /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk to a smaller size. 
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG nova.objects.instance [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lazy-loading 'migration_context' on Instance uuid 2ae07df3-4bf4-44a5-a772-3507a6dde6ab {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG nova.network.neutron [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Successfully created port: 15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Ensure instance console log exists: /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.network.neutron [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Successfully updated port: 15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None 
req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Acquiring lock "refresh_cache-2ae07df3-4bf4-44a5-a772-3507a6dde6ab" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Acquired lock "refresh_cache-2ae07df3-4bf4-44a5-a772-3507a6dde6ab" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.network.neutron [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.compute.manager [req-6cf601c3-6d56-4beb-84bf-0900cee834ed req-d9808f4f-e109-42f3-9a30-46d844824189 service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Received event network-changed-15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.compute.manager [req-6cf601c3-6d56-4beb-84bf-0900cee834ed req-d9808f4f-e109-42f3-9a30-46d844824189 service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Refreshing instance network info cache due to event network-changed-15758bd1-cf67-4bc6-9408-9740cd79d26d. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-6cf601c3-6d56-4beb-84bf-0900cee834ed req-d9808f4f-e109-42f3-9a30-46d844824189 service nova] Acquiring lock "refresh_cache-2ae07df3-4bf4-44a5-a772-3507a6dde6ab" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.network.neutron [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.network.neutron [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Updating instance_info_cache with network_info: [{"id": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "address": "fa:16:3e:19:e6:4f", "network": {"id": "ee3596e4-eb99-458a-b7ae-a48f8bcd58c7", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-1394515631-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9daf036d4ad84586a628c454408e3d7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap15758bd1-cf", "ovs_interfaceid": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Releasing lock "refresh_cache-2ae07df3-4bf4-44a5-a772-3507a6dde6ab" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.compute.manager [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Instance network_info: |[{"id": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "address": "fa:16:3e:19:e6:4f", "network": {"id": "ee3596e4-eb99-458a-b7ae-a48f8bcd58c7", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-1394515631-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9daf036d4ad84586a628c454408e3d7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap15758bd1-cf", "ovs_interfaceid": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-6cf601c3-6d56-4beb-84bf-0900cee834ed req-d9808f4f-e109-42f3-9a30-46d844824189 service nova] Acquired lock "refresh_cache-2ae07df3-4bf4-44a5-a772-3507a6dde6ab" {{(pid=71474) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.network.neutron [req-6cf601c3-6d56-4beb-84bf-0900cee834ed req-d9808f4f-e109-42f3-9a30-46d844824189 service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Refreshing network info cache for port 15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Start _get_guest_xml network_info=[{"id": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "address": "fa:16:3e:19:e6:4f", "network": {"id": "ee3596e4-eb99-458a-b7ae-a48f8bcd58c7", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-1394515631-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9daf036d4ad84586a628c454408e3d7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap15758bd1-cf", "ovs_interfaceid": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 13:59:23 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:59:23 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:59:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-781003247',display_name='tempest-SnapshotDataIntegrityTests-server-781003247',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-781003247',id=9,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKR10aEOjapD46pprj3PVJ6Gx5H8VQYM8ILWN6S/kFyLgzYn0q969VADuMdlZwliZmFRI4vN1i/LRJPIe9UBMJ19tPqMq6iASFPFIt5SbKZXfihMoI1E5AGB6AltaW1bLw==',key_name='tempest-SnapshotDataIntegrityTests-1470211823',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9daf036d4ad84586a628c454408e3d7d',ramdisk_id='',reservation_id='r-bi8kumkc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-1600761065',owner_user_name='tempest-SnapshotDataIntegrityTests-1600761065-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:59:22Z,user_data=None,user_id='1a2438d69a684df69e1de2edddc73bc0',uuid=2ae07df3-4bf4-44a5-a772-3507a6dde6ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "address": "fa:16:3e:19:e6:4f", "network": {"id": "ee3596e4-eb99-458a-b7ae-a48f8bcd58c7", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-1394515631-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9daf036d4ad84586a628c454408e3d7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap15758bd1-cf", "ovs_interfaceid": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 13:59:23 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Converting VIF {"id": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "address": "fa:16:3e:19:e6:4f", "network": {"id": "ee3596e4-eb99-458a-b7ae-a48f8bcd58c7", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-1394515631-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9daf036d4ad84586a628c454408e3d7d", "mtu": 1442,
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap15758bd1-cf", "ovs_interfaceid": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:e6:4f,bridge_name='br-int',has_traffic_filtering=True,id=15758bd1-cf67-4bc6-9408-9740cd79d26d,network=Network(ee3596e4-eb99-458a-b7ae-a48f8bcd58c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15758bd1-cf') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG nova.objects.instance [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lazy-loading 'pci_devices' on Instance uuid 2ae07df3-4bf4-44a5-a772-3507a6dde6ab {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] End _get_guest_xml xml=
Apr 21 13:59:24 user nova-compute[71474]: [libvirt guest XML for instance 2ae07df3-4bf4-44a5-a772-3507a6dde6ab: element markup stripped in this capture; the surviving fragments show domain name instance-00000009, 131072 KiB memory, 1 vCPU, Nova metadata (server tempest-SnapshotDataIntegrityTests-server-781003247, created 2023-04-21 13:59:23, owner project tempest-SnapshotDataIntegrityTests-1600761065), SMBIOS strings OpenStack Foundation / OpenStack Nova / 0.0.0, machine type hvm, CPU model Nehalem, and an RNG device backed by /dev/urandom] {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 21 13:59:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:59:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-781003247',display_name='tempest-SnapshotDataIntegrityTests-server-781003247',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-781003247',id=9,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKR10aEOjapD46pprj3PVJ6Gx5H8VQYM8ILWN6S/kFyLgzYn0q969VADuMdlZwliZmFRI4vN1i/LRJPIe9UBMJ19tPqMq6iASFPFIt5SbKZXfihMoI1E5AGB6AltaW1bLw==',key_name='tempest-SnapshotDataIntegrityTests-1470211823',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9daf036d4ad84586a628c454408e3d7d',ramdisk_id='',reservation_id='r-bi8kumkc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-1600761065',owner_user_name='tempest-SnapshotDataIntegrityTests-1600761065-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T13:59:22Z,user_data=None,user_id='1a2438d69a684df69e1de2edddc73bc0',uuid=2ae07df3-4bf4-44a5-a772-3507a6dde6ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "address": "fa:16:3e:19:e6:4f", "network": {"id": "ee3596e4-eb99-458a-b7ae-a48f8bcd58c7", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-1394515631-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9daf036d4ad84586a628c454408e3d7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap15758bd1-cf", "ovs_interfaceid": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Converting VIF {"id": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "address": "fa:16:3e:19:e6:4f", "network": {"id": "ee3596e4-eb99-458a-b7ae-a48f8bcd58c7", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-1394515631-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9daf036d4ad84586a628c454408e3d7d", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap15758bd1-cf", "ovs_interfaceid": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:e6:4f,bridge_name='br-int',has_traffic_filtering=True,id=15758bd1-cf67-4bc6-9408-9740cd79d26d,network=Network(ee3596e4-eb99-458a-b7ae-a48f8bcd58c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15758bd1-cf') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG os_vif [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e6:4f,bridge_name='br-int',has_traffic_filtering=True,id=15758bd1-cf67-4bc6-9408-9740cd79d26d,network=Network(ee3596e4-eb99-458a-b7ae-a48f8bcd58c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15758bd1-cf') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15758bd1-cf, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap15758bd1-cf, col_values=(('external_ids', {'iface-id': '15758bd1-cf67-4bc6-9408-9740cd79d26d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:e6:4f', 'vm-uuid': '2ae07df3-4bf4-44a5-a772-3507a6dde6ab'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:24 user nova-compute[71474]: INFO os_vif [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e6:4f,bridge_name='br-int',has_traffic_filtering=True,id=15758bd1-cf67-4bc6-9408-9740cd79d26d,network=Network(ee3596e4-eb99-458a-b7ae-a48f8bcd58c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15758bd1-cf') Apr 21 13:59:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] No VIF found with MAC fa:16:3e:19:e6:4f, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG nova.network.neutron [req-6cf601c3-6d56-4beb-84bf-0900cee834ed req-d9808f4f-e109-42f3-9a30-46d844824189 service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Updated VIF entry in instance network info cache for port 15758bd1-cf67-4bc6-9408-9740cd79d26d. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG nova.network.neutron [req-6cf601c3-6d56-4beb-84bf-0900cee834ed req-d9808f4f-e109-42f3-9a30-46d844824189 service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Updating instance_info_cache with network_info: [{"id": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "address": "fa:16:3e:19:e6:4f", "network": {"id": "ee3596e4-eb99-458a-b7ae-a48f8bcd58c7", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-1394515631-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9daf036d4ad84586a628c454408e3d7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap15758bd1-cf", "ovs_interfaceid": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:59:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-6cf601c3-6d56-4beb-84bf-0900cee834ed req-d9808f4f-e109-42f3-9a30-46d844824189 service nova] Releasing lock "refresh_cache-2ae07df3-4bf4-44a5-a772-3507a6dde6ab" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:59:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-c434ff41-92fa-40e6-93a5-1fc0933f1d4e req-30c75858-3f81-40a7-9c34-e5ebe0986d0b service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Received event network-vif-plugged-15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-c434ff41-92fa-40e6-93a5-1fc0933f1d4e req-30c75858-3f81-40a7-9c34-e5ebe0986d0b service nova] Acquiring lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-c434ff41-92fa-40e6-93a5-1fc0933f1d4e req-30c75858-3f81-40a7-9c34-e5ebe0986d0b service nova] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:25 
user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-c434ff41-92fa-40e6-93a5-1fc0933f1d4e req-30c75858-3f81-40a7-9c34-e5ebe0986d0b service nova] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-c434ff41-92fa-40e6-93a5-1fc0933f1d4e req-30c75858-3f81-40a7-9c34-e5ebe0986d0b service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] No waiting events found dispatching network-vif-plugged-15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:59:25 user nova-compute[71474]: WARNING nova.compute.manager [req-c434ff41-92fa-40e6-93a5-1fc0933f1d4e req-30c75858-3f81-40a7-9c34-e5ebe0986d0b service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Received unexpected event network-vif-plugged-15758bd1-cf67-4bc6-9408-9740cd79d26d for instance with vm_state building and task_state spawning. Apr 21 13:59:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:59:27 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] VM Resumed (Lifecycle Event) Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 13:59:27 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Instance spawned successfully. 
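Note: the ovsdbapp entries at 13:59:24 above (AddBridgeCommand, AddPortCommand, DbSetCommand) show how os-vif plugs tap15758bd1-cf into br-int: OVSDB transactions that ensure the bridge exists, add the port, and set the Neutron external_ids on the Interface record. Below is a minimal standalone sketch of issuing those same commands with ovsdbapp's Open_vSwitch API; the ovsdb-server endpoint is an assumption, the bridge, port name, and external_ids values are copied from the log, and this is an illustration rather than the actual os-vif code path.

# Sketch: drive the same OVSDB commands seen in the log (AddBridgeCommand,
# AddPortCommand, DbSetCommand) via ovsdbapp's Open_vSwitch schema API.
# The ovsdb-server endpoint below is assumed; it is not shown in the log.
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

OVSDB_ENDPOINT = 'tcp:127.0.0.1:6640'  # assumption for illustration

idl = connection.OvsdbIdl.from_server(OVSDB_ENDPOINT, 'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

# external_ids values copied from the DbSetCommand entry above.
external_ids = {
    'iface-id': '15758bd1-cf67-4bc6-9408-9740cd79d26d',
    'iface-status': 'active',
    'attached-mac': 'fa:16:3e:19:e6:4f',
    'vm-uuid': '2ae07df3-4bf4-44a5-a772-3507a6dde6ab',
}

# The log shows these issued as two separate transactions (bridge first,
# then port + external_ids); they are batched into one here for brevity.
with api.transaction(check_error=True) as txn:
    txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
    txn.add(api.add_port('br-int', 'tap15758bd1-cf', may_exist=True))
    txn.add(api.db_set('Interface', 'tap15758bd1-cf',
                       ('external_ids', external_ids)))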
Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 
tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 13:59:27 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 13:59:27 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] VM Started (Lifecycle Event) Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 13:59:27 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 13:59:27 user nova-compute[71474]: INFO nova.compute.manager [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Took 5.52 seconds to spawn the instance on the hypervisor. 
Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.compute.manager [req-4cfbdf1a-f4a5-41e0-ae99-fe0194372c7f req-4956e735-b216-4160-9af5-f056414a6baf service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Received event network-vif-plugged-15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-4cfbdf1a-f4a5-41e0-ae99-fe0194372c7f req-4956e735-b216-4160-9af5-f056414a6baf service nova] Acquiring lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-4cfbdf1a-f4a5-41e0-ae99-fe0194372c7f req-4956e735-b216-4160-9af5-f056414a6baf service nova] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-4cfbdf1a-f4a5-41e0-ae99-fe0194372c7f req-4956e735-b216-4160-9af5-f056414a6baf service nova] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:27 user nova-compute[71474]: DEBUG nova.compute.manager [req-4cfbdf1a-f4a5-41e0-ae99-fe0194372c7f req-4956e735-b216-4160-9af5-f056414a6baf service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] No waiting events found dispatching network-vif-plugged-15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:59:27 user nova-compute[71474]: WARNING nova.compute.manager [req-4cfbdf1a-f4a5-41e0-ae99-fe0194372c7f req-4956e735-b216-4160-9af5-f056414a6baf service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Received unexpected event network-vif-plugged-15758bd1-cf67-4bc6-9408-9740cd79d26d for instance with vm_state building and task_state spawning. Apr 21 13:59:27 user nova-compute[71474]: INFO nova.compute.manager [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Took 6.21 seconds to build instance. 
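Note: the network-vif-plugged entries above illustrate the external-event handshake: Neutron reports the port as plugged, the compute manager pops a registered waiter for that event if one exists, and otherwise logs the "Received unexpected event" warning (here the spawn's waiter had already completed, hence the warning). The toy sketch below shows that pop-or-warn pattern with plain threading primitives; the class and method names are invented for illustration and are not Nova's implementation.

# Toy illustration of the pop-or-warn pattern visible in the log: a waiter is
# registered per expected event name; an arriving event either wakes the
# waiter or is reported as unexpected. All names here are invented.
import threading
from collections import defaultdict


class InstanceEventWaiters:
    def __init__(self):
        self._lock = threading.Lock()        # cf. the "...-events" lock entries
        self._waiters = defaultdict(dict)    # instance uuid -> {event: Event}

    def prepare(self, instance_uuid, event_name):
        """Register interest before triggering the action that emits the event."""
        waiter = threading.Event()
        with self._lock:
            self._waiters[instance_uuid][event_name] = waiter
        return waiter

    def dispatch(self, instance_uuid, event_name):
        """Deliver an incoming event; return True if somebody was waiting."""
        with self._lock:
            waiter = self._waiters.get(instance_uuid, {}).pop(event_name, None)
        if waiter is None:
            print(f'Received unexpected event {event_name} '
                  f'for instance {instance_uuid}')
            return False
        waiter.set()
        return True


# Usage: prepare() before plugging the VIF, then wait(); a second notification
# for the same event finds no waiter and hits the "unexpected" branch.
waiters = InstanceEventWaiters()
uuid = '2ae07df3-4bf4-44a5-a772-3507a6dde6ab'
event = 'network-vif-plugged-15758bd1-cf67-4bc6-9408-9740cd79d26d'
w = waiters.prepare(uuid, event)
waiters.dispatch(uuid, event)   # wakes the waiter
w.wait(timeout=1)
waiters.dispatch(uuid, event)   # no waiter left -> "unexpected event"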
Apr 21 13:59:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-24e973b0-80b8-4b41-9408-f21918bcef13 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.299s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:29 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:59:50 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:59:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:51 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:59:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 13:59:51 user 
nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:59:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:59:51 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Updating instance_info_cache with network_info: [{"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] 
Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 13:59:52 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=71474) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk --force-share --output=json {{(pid=71474) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.143s 
{{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG nova.compute.manager [req-167b475a-c901-4188-90c8-0523071cb637 req-114f30a0-8919-47e3-9e39-2fe804ce950e service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Received event network-changed-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG nova.compute.manager [req-167b475a-c901-4188-90c8-0523071cb637 req-114f30a0-8919-47e3-9e39-2fe804ce950e service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Refreshing instance network info cache due to event network-changed-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-167b475a-c901-4188-90c8-0523071cb637 req-114f30a0-8919-47e3-9e39-2fe804ce950e service nova] Acquiring lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-167b475a-c901-4188-90c8-0523071cb637 req-114f30a0-8919-47e3-9e39-2fe804ce950e service nova] Acquired lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG nova.network.neutron [req-167b475a-c901-4188-90c8-0523071cb637 req-114f30a0-8919-47e3-9e39-2fe804ce950e service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Refreshing network info cache for port ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 13:59:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:56 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:59:56 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=7954MB free_disk=26.08350372314453GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5030decd-cbe5-4495-b497-dfacf25eef73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 30068c4a-94ed-4b84-9178-0d554326fc68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 3af27bc9-9617-44c7-bfa4-993b347d183c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 2c5afe45-87ae-477a-8bf0-6a5e2036fb68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 0346fbd8-64cd-45e7-906f-e00eeece91ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance f0f32b68-6993-4843-bcc6-bd0e06377b27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5e502c4c-a46b-4670-acba-2fda2d05adf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 90591d9b-6d6b-4f22-a3dc-fd83044df26b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 2ae07df3-4bf4-44a5-a772-3507a6dde6ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 9 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=1664MB phys_disk=40GB used_disk=9GB total_vcpus=12 used_vcpus=9 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.network.neutron [req-167b475a-c901-4188-90c8-0523071cb637 req-114f30a0-8919-47e3-9e39-2fe804ce950e service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Updated VIF entry in instance network info cache for port ed62554b-cbc2-4c0f-ad1a-821a0625a2e4. {{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.network.neutron [req-167b475a-c901-4188-90c8-0523071cb637 req-114f30a0-8919-47e3-9e39-2fe804ce950e service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Updating instance_info_cache with network_info: [{"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.116", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-167b475a-c901-4188-90c8-0523071cb637 req-114f30a0-8919-47e3-9e39-2fe804ce950e service nova] Releasing lock "refresh_cache-5030decd-cbe5-4495-b497-dfacf25eef73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 13:59:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.429s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "5030decd-cbe5-4495-b497-dfacf25eef73" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "5030decd-cbe5-4495-b497-dfacf25eef73" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:57 user nova-compute[71474]: INFO nova.compute.manager [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 
5030decd-cbe5-4495-b497-dfacf25eef73] Terminating instance Apr 21 13:59:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG nova.compute.manager [req-a33506a8-9e3e-49f5-ba91-00730b26b208 req-c346b146-d722-4efa-9424-6dce98b6622d service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Received event network-vif-unplugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a33506a8-9e3e-49f5-ba91-00730b26b208 req-c346b146-d722-4efa-9424-6dce98b6622d service nova] Acquiring lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a33506a8-9e3e-49f5-ba91-00730b26b208 req-c346b146-d722-4efa-9424-6dce98b6622d service nova] Lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a33506a8-9e3e-49f5-ba91-00730b26b208 req-c346b146-d722-4efa-9424-6dce98b6622d service nova] Lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG nova.compute.manager [req-a33506a8-9e3e-49f5-ba91-00730b26b208 req-c346b146-d722-4efa-9424-6dce98b6622d service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] No waiting events found dispatching network-vif-unplugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG nova.compute.manager [req-a33506a8-9e3e-49f5-ba91-00730b26b208 req-c346b146-d722-4efa-9424-6dce98b6622d service nova] 
[instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Received event network-vif-unplugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:57 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Instance destroyed successfully. Apr 21 13:59:57 user nova-compute[71474]: DEBUG nova.objects.instance [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lazy-loading 'resources' on Instance uuid 5030decd-cbe5-4495-b497-dfacf25eef73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-872679092',display_name='tempest-AttachVolumeTestJSON-server-872679092',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-872679092',id=1,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKmS+b9xlwFpk/ZB8qhPKYszGUSbkqS/wmxdPA2+EZTBrrlLdczt4kqaoNpF+PGGMbYhMksCR0Bk+16nj7FF3bWR1LjQ0yuktGjeeohbe83sc0eREByhabKclSuVNg5vfQ==',key_name='tempest-keypair-1234083121',keypairs=,launch_index=0,launched_at=2023-04-21T13:58:18Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='91f5972380fd48eabffd46e6727239ce',ramdisk_id='',reservation_id='r-tfwpbn06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-1194238008',owner_user_name='tempest-AttachVolumeTestJSON-1194238008-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T13:58:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='113884844de14ec7ac8a20ba06a389b3',uuid=5030decd-cbe5-4495-b497-dfacf25eef73,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.116", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converting VIF {"id": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "address": "fa:16:3e:e2:cc:bd", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.116", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "taped62554b-cb", "ovs_interfaceid": "ed62554b-cbc2-4c0f-ad1a-821a0625a2e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e2:cc:bd,bridge_name='br-int',has_traffic_filtering=True,id=ed62554b-cbc2-4c0f-ad1a-821a0625a2e4,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped62554b-cb') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG os_vif [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:cc:bd,bridge_name='br-int',has_traffic_filtering=True,id=ed62554b-cbc2-4c0f-ad1a-821a0625a2e4,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped62554b-cb') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped62554b-cb, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 13:59:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 13:59:57 user nova-compute[71474]: INFO os_vif [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:cc:bd,bridge_name='br-int',has_traffic_filtering=True,id=ed62554b-cbc2-4c0f-ad1a-821a0625a2e4,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped62554b-cb') Apr 21 13:59:57 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 
tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Deleting instance files /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73_del Apr 21 13:59:57 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Deletion of /opt/stack/data/nova/instances/5030decd-cbe5-4495-b497-dfacf25eef73_del complete Apr 21 13:59:58 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Checking UEFI support for host arch (x86_64) {{(pid=71474) supports_uefi /opt/stack/nova/nova/virt/libvirt/host.py:1722}} Apr 21 13:59:58 user nova-compute[71474]: INFO nova.virt.libvirt.host [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] UEFI support detected Apr 21 13:59:58 user nova-compute[71474]: INFO nova.compute.manager [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Took 0.91 seconds to destroy the instance on the hypervisor. Apr 21 13:59:58 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 13:59:58 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 13:59:58 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 13:59:59 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Took 0.96 seconds to deallocate network for instance. 
Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.compute.manager [req-5f5ceafd-d108-4c6e-801c-c5410b4dd8dc req-05f55f2f-e32a-45e2-81e5-d1cba93dfe76 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Received event network-vif-deleted-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] 
Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.311s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.compute.manager [req-ba0cab32-9c77-4d9d-86ae-dae006a8cdb1 req-c9ec8107-580f-4d6e-88ec-82eec995e174 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Received event network-vif-plugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ba0cab32-9c77-4d9d-86ae-dae006a8cdb1 req-c9ec8107-580f-4d6e-88ec-82eec995e174 service nova] Acquiring lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ba0cab32-9c77-4d9d-86ae-dae006a8cdb1 req-c9ec8107-580f-4d6e-88ec-82eec995e174 service nova] Lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ba0cab32-9c77-4d9d-86ae-dae006a8cdb1 req-c9ec8107-580f-4d6e-88ec-82eec995e174 service nova] Lock "5030decd-cbe5-4495-b497-dfacf25eef73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.compute.manager [req-ba0cab32-9c77-4d9d-86ae-dae006a8cdb1 req-c9ec8107-580f-4d6e-88ec-82eec995e174 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] No waiting events found dispatching network-vif-plugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 13:59:59 user nova-compute[71474]: WARNING nova.compute.manager [req-ba0cab32-9c77-4d9d-86ae-dae006a8cdb1 req-c9ec8107-580f-4d6e-88ec-82eec995e174 service nova] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Received unexpected event network-vif-plugged-ed62554b-cbc2-4c0f-ad1a-821a0625a2e4 for instance with vm_state deleted and task_state None. 
Apr 21 13:59:59 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Deleted allocations for instance 5030decd-cbe5-4495-b497-dfacf25eef73 Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 13:59:59 user nova-compute[71474]: INFO nova.compute.claims [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Claim successful on node user Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c4f11915-a906-4840-8ced-dcca73a601c9 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "5030decd-cbe5-4495-b497-dfacf25eef73" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.409s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None 
req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.390s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 13:59:59 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 13:59:59 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:00:00 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG nova.policy [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2259f365261c49b28b56ddd1c27c125d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a8c210480b33473c91156b798bcbd8b2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Start spawning the instance on the hypervisor. 
{{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:00:00 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Creating image(s) Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "/opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "/opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "/opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None 
req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.133s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk 1073741824" returned: 0 in 0.103s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.242s 
{{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Checking if we can resize image /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Cannot resize image /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk to a smaller size. 
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG nova.objects.instance [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lazy-loading 'migration_context' on Instance uuid 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Ensure instance console log exists: /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:00 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Successfully created port: 9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Successfully updated port: 9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) _update_port 
/opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "refresh_cache-4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquired lock "refresh_cache-4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.compute.manager [req-30931b37-0ba1-4300-9804-71b1f6b8847c req-4200a501-d343-4249-a59e-e42358e06751 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Received event network-changed-9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.compute.manager [req-30931b37-0ba1-4300-9804-71b1f6b8847c req-4200a501-d343-4249-a59e-e42358e06751 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Refreshing instance network info cache due to event network-changed-9d4913bf-46f1-4c09-a062-803300bbed23. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-30931b37-0ba1-4300-9804-71b1f6b8847c req-4200a501-d343-4249-a59e-e42358e06751 service nova] Acquiring lock "refresh_cache-4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Updating instance_info_cache with network_info: [{"id": "9d4913bf-46f1-4c09-a062-803300bbed23", "address": "fa:16:3e:eb:54:b2", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d4913bf-46", "ovs_interfaceid": "9d4913bf-46f1-4c09-a062-803300bbed23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Releasing lock "refresh_cache-4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Instance network_info: |[{"id": "9d4913bf-46f1-4c09-a062-803300bbed23", "address": "fa:16:3e:eb:54:b2", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d4913bf-46", "ovs_interfaceid": "9d4913bf-46f1-4c09-a062-803300bbed23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-30931b37-0ba1-4300-9804-71b1f6b8847c req-4200a501-d343-4249-a59e-e42358e06751 service nova] Acquired lock "refresh_cache-4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" {{(pid=71474) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.network.neutron [req-30931b37-0ba1-4300-9804-71b1f6b8847c req-4200a501-d343-4249-a59e-e42358e06751 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Refreshing network info cache for port 9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Start _get_guest_xml network_info=[{"id": "9d4913bf-46f1-4c09-a062-803300bbed23", "address": "fa:16:3e:eb:54:b2", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d4913bf-46", "ovs_interfaceid": "9d4913bf-46f1-4c09-a062-803300bbed23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:00:01 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:00:01 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:00:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:59:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-911731909',display_name='tempest-ServersNegativeTestJSON-server-911731909',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-911731909',id=10,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8c210480b33473c91156b798bcbd8b2',ramdisk_id='',reservation_id='r-z0r56r24',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1552178734',owner_user_name='tempest-ServersNegativeTestJS
ON-1552178734-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:00:00Z,user_data=None,user_id='2259f365261c49b28b56ddd1c27c125d',uuid=4f8622ba-dea6-454f-90c8-1f5f6a56e0b4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9d4913bf-46f1-4c09-a062-803300bbed23", "address": "fa:16:3e:eb:54:b2", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d4913bf-46", "ovs_interfaceid": "9d4913bf-46f1-4c09-a062-803300bbed23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converting VIF {"id": "9d4913bf-46f1-4c09-a062-803300bbed23", "address": "fa:16:3e:eb:54:b2", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d4913bf-46", "ovs_interfaceid": "9d4913bf-46f1-4c09-a062-803300bbed23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:54:b2,bridge_name='br-int',has_traffic_filtering=True,id=9d4913bf-46f1-4c09-a062-803300bbed23,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d4913bf-46') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.objects.instance [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] 
Lazy-loading 'pci_devices' on Instance uuid 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] End _get_guest_xml xml= [libvirt domain XML elided by log extraction; recoverable fields: name instance-0000000a, uuid 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4, memory 131072 KiB, 1 vCPU, nova metadata name tempest-ServersNegativeTestJSON-server-911731909 created 2023-04-21 14:00:01, flavor 128 MB / 1 vCPU, owner tempest-ServersNegativeTestJSON-1552178734-project-member in project tempest-ServersNegativeTestJSON-1552178734, sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:59:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-911731909',display_name='tempest-ServersNegativeTestJSON-server-911731909',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-911731909',id=10,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a8c210480b33473c91156b798bcbd8b2',ramdisk_id='',reservation_id='r-z0r56r24',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1552178734',owner_user_name='tempest-ServersNegativeTestJSON-1552178734-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:00:00Z,user_data=None,user_id='2259f365261c49b28b56ddd1c27c125d',uuid=4f8622ba-dea6-454f-90c8-1f5f6a56e0b4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9d4913bf-46f1-4c09-a062-803300bbed23", "address": "fa:16:3e:eb:54:b2", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [],
"gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d4913bf-46", "ovs_interfaceid": "9d4913bf-46f1-4c09-a062-803300bbed23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converting VIF {"id": "9d4913bf-46f1-4c09-a062-803300bbed23", "address": "fa:16:3e:eb:54:b2", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d4913bf-46", "ovs_interfaceid": "9d4913bf-46f1-4c09-a062-803300bbed23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:54:b2,bridge_name='br-int',has_traffic_filtering=True,id=9d4913bf-46f1-4c09-a062-803300bbed23,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d4913bf-46') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG os_vif [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:54:b2,bridge_name='br-int',has_traffic_filtering=True,id=9d4913bf-46f1-4c09-a062-803300bbed23,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d4913bf-46') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:02 user 
nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9d4913bf-46, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9d4913bf-46, col_values=(('external_ids', {'iface-id': '9d4913bf-46f1-4c09-a062-803300bbed23', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:54:b2', 'vm-uuid': '4f8622ba-dea6-454f-90c8-1f5f6a56e0b4'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:02 user nova-compute[71474]: INFO os_vif [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:54:b2,bridge_name='br-int',has_traffic_filtering=True,id=9d4913bf-46f1-4c09-a062-803300bbed23,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d4913bf-46') Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] No VIF found with MAC fa:16:3e:eb:54:b2, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.network.neutron [req-30931b37-0ba1-4300-9804-71b1f6b8847c req-4200a501-d343-4249-a59e-e42358e06751 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Updated VIF entry in instance network info cache for port 9d4913bf-46f1-4c09-a062-803300bbed23. {{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG nova.network.neutron [req-30931b37-0ba1-4300-9804-71b1f6b8847c req-4200a501-d343-4249-a59e-e42358e06751 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Updating instance_info_cache with network_info: [{"id": "9d4913bf-46f1-4c09-a062-803300bbed23", "address": "fa:16:3e:eb:54:b2", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d4913bf-46", "ovs_interfaceid": "9d4913bf-46f1-4c09-a062-803300bbed23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-30931b37-0ba1-4300-9804-71b1f6b8847c req-4200a501-d343-4249-a59e-e42358e06751 service nova] Releasing lock "refresh_cache-4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:00:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:03 user nova-compute[71474]: DEBUG nova.compute.manager [req-772ba3dd-559d-46d3-913f-bf163cdb07b9 req-a21eea64-0533-4cb5-b7c1-2958dc231363 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Received event network-vif-plugged-9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:03 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [req-772ba3dd-559d-46d3-913f-bf163cdb07b9 req-a21eea64-0533-4cb5-b7c1-2958dc231363 service nova] Acquiring lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-772ba3dd-559d-46d3-913f-bf163cdb07b9 req-a21eea64-0533-4cb5-b7c1-2958dc231363 service nova] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-772ba3dd-559d-46d3-913f-bf163cdb07b9 req-a21eea64-0533-4cb5-b7c1-2958dc231363 service nova] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:03 user nova-compute[71474]: DEBUG nova.compute.manager [req-772ba3dd-559d-46d3-913f-bf163cdb07b9 req-a21eea64-0533-4cb5-b7c1-2958dc231363 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] No waiting events found dispatching network-vif-plugged-9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:03 user nova-compute[71474]: WARNING nova.compute.manager [req-772ba3dd-559d-46d3-913f-bf163cdb07b9 req-a21eea64-0533-4cb5-b7c1-2958dc231363 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Received unexpected event network-vif-plugged-9d4913bf-46f1-4c09-a062-803300bbed23 for instance with vm_state building and task_state spawning. 
Apr 21 14:00:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Received event network-vif-plugged-9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] Acquiring lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] No waiting events found dispatching network-vif-plugged-9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:05 user nova-compute[71474]: WARNING nova.compute.manager [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Received unexpected event network-vif-plugged-9d4913bf-46f1-4c09-a062-803300bbed23 for instance with vm_state building and task_state spawning. Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Received event network-changed-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Refreshing instance network info cache due to event network-changed-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d. 
{{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] Acquiring lock "refresh_cache-3af27bc9-9617-44c7-bfa4-993b347d183c" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] Acquired lock "refresh_cache-3af27bc9-9617-44c7-bfa4-993b347d183c" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.network.neutron [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Refreshing network info cache for port 7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:00:05 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] VM Resumed (Lifecycle Event) Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:00:05 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Instance spawned successfully. 
Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 
4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:05 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:00:05 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] VM Started (Lifecycle Event) Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:00:05 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:00:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:05 user nova-compute[71474]: INFO nova.compute.manager [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Took 5.68 seconds to spawn the instance on the hypervisor. Apr 21 14:00:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:05 user nova-compute[71474]: INFO nova.compute.manager [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Took 6.38 seconds to build instance. 
Apr 21 14:00:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f21ca4f8-fbbd-46c9-9803-27be1a10234a tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.486s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:06 user nova-compute[71474]: DEBUG nova.network.neutron [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Updated VIF entry in instance network info cache for port 7e465dfa-8ae6-4806-b18d-e23dcbf0a97d. {{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:00:06 user nova-compute[71474]: DEBUG nova.network.neutron [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Updating instance_info_cache with network_info: [{"id": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "address": "fa:16:3e:da:eb:fa", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.38", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e465dfa-8a", "ovs_interfaceid": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-b6d73f1e-714e-4aa2-8983-fc01914ce65e req-9990db50-8e45-462b-9330-228df2e567bd service nova] Releasing lock "refresh_cache-3af27bc9-9617-44c7-bfa4-993b347d183c" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:00:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "3af27bc9-9617-44c7-bfa4-993b347d183c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:06 user nova-compute[71474]: INFO nova.compute.manager [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Terminating instance Apr 21 14:00:06 user nova-compute[71474]: DEBUG nova.compute.manager [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Start destroying the instance on the hypervisor. 
{{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:00:06 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG nova.compute.manager [req-a584e436-5f77-4e9d-9720-e8cd8c12a115 req-7838eb0f-0194-4bbc-954b-d7a83e418f89 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Received event network-vif-unplugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a584e436-5f77-4e9d-9720-e8cd8c12a115 req-7838eb0f-0194-4bbc-954b-d7a83e418f89 service nova] Acquiring lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a584e436-5f77-4e9d-9720-e8cd8c12a115 req-7838eb0f-0194-4bbc-954b-d7a83e418f89 service nova] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a584e436-5f77-4e9d-9720-e8cd8c12a115 req-7838eb0f-0194-4bbc-954b-d7a83e418f89 service nova] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG nova.compute.manager [req-a584e436-5f77-4e9d-9720-e8cd8c12a115 req-7838eb0f-0194-4bbc-954b-d7a83e418f89 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] No waiting events found dispatching network-vif-unplugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG nova.compute.manager [req-a584e436-5f77-4e9d-9720-e8cd8c12a115 req-7838eb0f-0194-4bbc-954b-d7a83e418f89 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Received event network-vif-unplugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d for instance with task_state deleting. 
{{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:00:07 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Instance destroyed successfully. Apr 21 14:00:07 user nova-compute[71474]: DEBUG nova.objects.instance [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lazy-loading 'resources' on Instance uuid 3af27bc9-9617-44c7-bfa4-993b347d183c {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1376655679',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1376655679',id=3,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBSfCm4kaW8kKhEYk26quHUV0I+U7qCvNO4t2+x3r4VBiTtZqyaoR2EGTBvf5XEnA51qy75jGfzROX158wPVnPuLv0NMr8g0Jge5rbAgJLAI/7LXIqSONczhV0yG5arkdQ==',key_name='tempest-keypair-2134167482',keypairs=,launch_index=0,launched_at=2023-04-21T13:58:25Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='8a8fedc10f324a92aef4142ab7efdd6a',ramdisk_id='',reservation_id='r-n1llw0zy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-2115713901',owner_user_name='tempest-AttachVolumeShelveTestJSON-2115713901-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T13:58:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='92c19bad528a4c38860a43913b28b85b',uuid=3af27bc9-9617-44c7-bfa4-993b347d183c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "address": "fa:16:3e:da:eb:fa", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": 
"tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.38", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e465dfa-8a", "ovs_interfaceid": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converting VIF {"id": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "address": "fa:16:3e:da:eb:fa", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.38", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e465dfa-8a", "ovs_interfaceid": "7e465dfa-8ae6-4806-b18d-e23dcbf0a97d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:da:eb:fa,bridge_name='br-int',has_traffic_filtering=True,id=7e465dfa-8ae6-4806-b18d-e23dcbf0a97d,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e465dfa-8a') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG os_vif [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:da:eb:fa,bridge_name='br-int',has_traffic_filtering=True,id=7e465dfa-8ae6-4806-b18d-e23dcbf0a97d,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e465dfa-8a') {{(pid=71474) unplug 
/usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e465dfa-8a, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:07 user nova-compute[71474]: INFO os_vif [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:da:eb:fa,bridge_name='br-int',has_traffic_filtering=True,id=7e465dfa-8ae6-4806-b18d-e23dcbf0a97d,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e465dfa-8a') Apr 21 14:00:07 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Deleting instance files /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c_del Apr 21 14:00:07 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Deletion of /opt/stack/data/nova/instances/3af27bc9-9617-44c7-bfa4-993b347d183c_del complete Apr 21 14:00:07 user nova-compute[71474]: INFO nova.compute.manager [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 21 14:00:07 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:00:07 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:00:08 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:08 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Took 0.79 seconds to deallocate network for instance. Apr 21 14:00:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:08 user nova-compute[71474]: DEBUG nova.compute.manager [req-9c9bd7aa-7f92-4917-afa4-d51912f2a238 req-4ca208d8-81b4-4aaa-bbab-be9bbfaa47e3 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Received event network-vif-deleted-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:08 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:00:08 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:00:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 
tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.310s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:08 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Deleted allocations for instance 3af27bc9-9617-44c7-bfa4-993b347d183c Apr 21 14:00:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-57a368e6-073e-42b3-a8d7-ca128f00c847 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.085s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:09 user nova-compute[71474]: DEBUG nova.compute.manager [req-f45994ca-b6d3-4741-86dc-a25cffbf5ee9 req-9ff5da09-0002-459c-a0a3-7777e0ed3db2 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Received event network-vif-plugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f45994ca-b6d3-4741-86dc-a25cffbf5ee9 req-9ff5da09-0002-459c-a0a3-7777e0ed3db2 service nova] Acquiring lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f45994ca-b6d3-4741-86dc-a25cffbf5ee9 req-9ff5da09-0002-459c-a0a3-7777e0ed3db2 service nova] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f45994ca-b6d3-4741-86dc-a25cffbf5ee9 req-9ff5da09-0002-459c-a0a3-7777e0ed3db2 service nova] Lock "3af27bc9-9617-44c7-bfa4-993b347d183c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:09 user nova-compute[71474]: DEBUG nova.compute.manager [req-f45994ca-b6d3-4741-86dc-a25cffbf5ee9 req-9ff5da09-0002-459c-a0a3-7777e0ed3db2 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] No waiting events found dispatching network-vif-plugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:09 user nova-compute[71474]: WARNING nova.compute.manager [req-f45994ca-b6d3-4741-86dc-a25cffbf5ee9 req-9ff5da09-0002-459c-a0a3-7777e0ed3db2 service nova] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Received unexpected event network-vif-plugged-7e465dfa-8ae6-4806-b18d-e23dcbf0a97d for instance with vm_state deleted and task_state None. 
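The paired "Acquiring lock" / "acquired ... waited" / "released ... held" messages in the entries above are emitted by oslo.concurrency's lockutils wrappers around Nova's per-instance operations (the `inner` frames at lockutils.py:404/409/423 and the lock/acquire/release frames at lockutils.py:312/315/333). As a minimal illustrative sketch of that same pattern, and not Nova's actual code, it looks roughly like this; it assumes oslo.concurrency is installed, and the function name below is hypothetical:

    # Sketch of the per-instance locking pattern visible in the log above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('3af27bc9-9617-44c7-bfa4-993b347d183c')
    def do_terminate_instance():
        # While this lock is held, other work keyed on the same instance UUID
        # must wait; lockutils itself emits the DEBUG "acquired ... waited" /
        # "released ... held" lines seen in the log.
        pass

    # The context-manager form produces the plain Acquiring/Acquired/Releasing lines.
    with lockutils.lock('refresh_cache-3af27bc9-9617-44c7-bfa4-993b347d183c'):
        pass

    do_terminate_instance()

The lock names shown here are taken from the entries above (the instance UUID and its refresh_cache-prefixed variant); in the log the decorated callables are Nova's own nested functions such as do_terminate_instance and _pop_event.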
Apr 21 14:00:10 user nova-compute[71474]: DEBUG nova.compute.manager [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:10 user nova-compute[71474]: INFO nova.compute.manager [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] instance snapshotting Apr 21 14:00:10 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Beginning live snapshot process Apr 21 14:00:11 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json -f qcow2 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json -f qcow2" returned: 0 in 0.133s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json -f qcow2 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG nova.compute.manager [req-5ac8ea38-8363-4a64-87ae-a9ce26808ce0 req-7ea1ed57-fc22-4321-8986-94082e710fd8 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Received event network-changed-2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG nova.compute.manager [req-5ac8ea38-8363-4a64-87ae-a9ce26808ce0 req-7ea1ed57-fc22-4321-8986-94082e710fd8 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Refreshing instance network info cache due to 
event network-changed-2616f5a4-1b53-44bd-82ad-65419e2839ca. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5ac8ea38-8363-4a64-87ae-a9ce26808ce0 req-7ea1ed57-fc22-4321-8986-94082e710fd8 service nova] Acquiring lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5ac8ea38-8363-4a64-87ae-a9ce26808ce0 req-7ea1ed57-fc22-4321-8986-94082e710fd8 service nova] Acquired lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG nova.network.neutron [req-5ac8ea38-8363-4a64-87ae-a9ce26808ce0 req-7ea1ed57-fc22-4321-8986-94082e710fd8 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Refreshing network info cache for port 2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json -f qcow2" returned: 0 in 0.149s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.146s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmps_u3ht3c/33ed2aeabe8447b3b85e55d899f9f401.delta 1073741824 {{(pid=71474) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmps_u3ht3c/33ed2aeabe8447b3b85e55d899f9f401.delta 1073741824" returned: 0 in 0.042s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:11 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Quiescing instance not available: QEMU guest agent is not enabled. Apr 21 14:00:11 user nova-compute[71474]: DEBUG nova.network.neutron [req-5ac8ea38-8363-4a64-87ae-a9ce26808ce0 req-7ea1ed57-fc22-4321-8986-94082e710fd8 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Updated VIF entry in instance network info cache for port 2616f5a4-1b53-44bd-82ad-65419e2839ca. {{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG nova.network.neutron [req-5ac8ea38-8363-4a64-87ae-a9ce26808ce0 req-7ea1ed57-fc22-4321-8986-94082e710fd8 service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Updating instance_info_cache with network_info: [{"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5ac8ea38-8363-4a64-87ae-a9ce26808ce0 req-7ea1ed57-fc22-4321-8986-94082e710fd8 service nova] Releasing lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:00:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.guest [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71474) is_job_complete 
/opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 14:00:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.guest [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71474) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 14:00:12 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 21 14:00:12 user nova-compute[71474]: DEBUG nova.privsep.utils [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71474) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 14:00:12 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmps_u3ht3c/33ed2aeabe8447b3b85e55d899f9f401.delta /opt/stack/data/nova/instances/snapshots/tmps_u3ht3c/33ed2aeabe8447b3b85e55d899f9f401 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:12 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:00:12 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] VM Stopped (Lifecycle Event) Apr 21 14:00:12 user nova-compute[71474]: DEBUG nova.compute.manager [None req-db9e8fc3-b46a-4868-aaf1-f4c76522ad4b None None] [instance: 5030decd-cbe5-4495-b497-dfacf25eef73] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:13 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmps_u3ht3c/33ed2aeabe8447b3b85e55d899f9f401.delta /opt/stack/data/nova/instances/snapshots/tmps_u3ht3c/33ed2aeabe8447b3b85e55d899f9f401" returned: 0 in 0.194s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:13 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Snapshot extracted, beginning image upload Apr 21 14:00:15 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e 
tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Snapshot image upload complete Apr 21 14:00:15 user nova-compute[71474]: INFO nova.compute.manager [None req-a71ae1a7-085f-48bc-afd6-68c22df54e9e tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Took 4.32 seconds to snapshot the instance on the hypervisor. Apr 21 14:00:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:17 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:22 user nova-compute[71474]: DEBUG nova.compute.manager [req-174db664-3469-47d4-adc8-28aac86fc9b5 req-2bf520a9-4ab3-41f1-975a-725f1c32da6c service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Received event network-changed-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:22 user nova-compute[71474]: DEBUG nova.compute.manager [req-174db664-3469-47d4-adc8-28aac86fc9b5 req-2bf520a9-4ab3-41f1-975a-725f1c32da6c service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Refreshing instance network info cache due to event network-changed-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8. 
{{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:00:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-174db664-3469-47d4-adc8-28aac86fc9b5 req-2bf520a9-4ab3-41f1-975a-725f1c32da6c service nova] Acquiring lock "refresh_cache-0346fbd8-64cd-45e7-906f-e00eeece91ce" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:00:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-174db664-3469-47d4-adc8-28aac86fc9b5 req-2bf520a9-4ab3-41f1-975a-725f1c32da6c service nova] Acquired lock "refresh_cache-0346fbd8-64cd-45e7-906f-e00eeece91ce" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:00:22 user nova-compute[71474]: DEBUG nova.network.neutron [req-174db664-3469-47d4-adc8-28aac86fc9b5 req-2bf520a9-4ab3-41f1-975a-725f1c32da6c service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Refreshing network info cache for port 94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:00:22 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:00:22 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] VM Stopped (Lifecycle Event) Apr 21 14:00:22 user nova-compute[71474]: DEBUG nova.compute.manager [None req-78cb36c1-9ae0-4e73-a73d-bde061fadba0 None None] [instance: 3af27bc9-9617-44c7-bfa4-993b347d183c] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:22 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:22 user nova-compute[71474]: DEBUG nova.network.neutron [req-174db664-3469-47d4-adc8-28aac86fc9b5 req-2bf520a9-4ab3-41f1-975a-725f1c32da6c service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Updated VIF entry in instance network info cache for port 94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:00:22 user nova-compute[71474]: DEBUG nova.network.neutron [req-174db664-3469-47d4-adc8-28aac86fc9b5 req-2bf520a9-4ab3-41f1-975a-725f1c32da6c service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Updating instance_info_cache with network_info: [{"id": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "address": "fa:16:3e:53:67:9d", "network": {"id": "d373a2f1-e506-4b6b-ab93-37e77bb02c7a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-197870629-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1953479f081341b088eaecb8369fce0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94bfdf6c-66", "ovs_interfaceid": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-174db664-3469-47d4-adc8-28aac86fc9b5 req-2bf520a9-4ab3-41f1-975a-725f1c32da6c service nova] Releasing lock "refresh_cache-0346fbd8-64cd-45e7-906f-e00eeece91ce" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Acquiring lock "0346fbd8-64cd-45e7-906f-e00eeece91ce" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Acquiring lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:23 user nova-compute[71474]: INFO nova.compute.manager [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Terminating instance Apr 21 14:00:23 user nova-compute[71474]: DEBUG nova.compute.manager [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG nova.compute.manager [req-8121df1a-05b1-4cc5-9aaa-2538fa25e41a req-611760fb-7d2f-426c-812d-75ad4602e477 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Received event network-vif-unplugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8121df1a-05b1-4cc5-9aaa-2538fa25e41a req-611760fb-7d2f-426c-812d-75ad4602e477 service nova] Acquiring lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8121df1a-05b1-4cc5-9aaa-2538fa25e41a req-611760fb-7d2f-426c-812d-75ad4602e477 service nova] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8121df1a-05b1-4cc5-9aaa-2538fa25e41a req-611760fb-7d2f-426c-812d-75ad4602e477 service nova] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG nova.compute.manager [req-8121df1a-05b1-4cc5-9aaa-2538fa25e41a req-611760fb-7d2f-426c-812d-75ad4602e477 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] No waiting events found dispatching network-vif-unplugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:23 user nova-compute[71474]: DEBUG nova.compute.manager [req-8121df1a-05b1-4cc5-9aaa-2538fa25e41a req-611760fb-7d2f-426c-812d-75ad4602e477 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Received event network-vif-unplugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:00:24 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Instance destroyed successfully. Apr 21 14:00:24 user nova-compute[71474]: DEBUG nova.objects.instance [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lazy-loading 'resources' on Instance uuid 0346fbd8-64cd-45e7-906f-e00eeece91ce {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:00:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-21T13:58:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-704385804',display_name='tempest-AttachSCSIVolumeTestJSON-server-704385804',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-704385804',id=5,image_ref='2be63a0e-17cf-4166-a64b-96ec2f419df8',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH1e308suG6Up9XacMAZI+/gRfmiDMDOuigLvKXnubV1WGNa08tjA4vufwk45vBBKQZeBxNXGuSgt+lX8T8X9YT0Q/e8fHr4reV0+dJepejAd/TWi7ALI+M/7BbZu2uxsg==',key_name='tempest-keypair-198905762',keypairs=,launch_index=0,launched_at=2023-04-21T13:58:43Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='1953479f081341b088eaecb8369fce0b',ramdisk_id='',reservation_id='r-26d32h24',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2be63a0e-17cf-4166-a64b-96ec2f419df8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1130428952',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1130428952-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T13:58:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='7ac1dc66f96249f884f9b4dbc73f37b4',uuid=0346fbd8-64cd-45e7-906f-e00eeece91ce,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "address": "fa:16:3e:53:67:9d", "network": {"id": "d373a2f1-e506-4b6b-ab93-37e77bb02c7a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-197870629-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1953479f081341b088eaecb8369fce0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94bfdf6c-66", "ovs_interfaceid": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:00:24 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Converting VIF {"id": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "address": "fa:16:3e:53:67:9d", "network": {"id": "d373a2f1-e506-4b6b-ab93-37e77bb02c7a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-197870629-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.189", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1953479f081341b088eaecb8369fce0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94bfdf6c-66", "ovs_interfaceid": "94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:00:24 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:53:67:9d,bridge_name='br-int',has_traffic_filtering=True,id=94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8,network=Network(d373a2f1-e506-4b6b-ab93-37e77bb02c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94bfdf6c-66') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:00:24 user nova-compute[71474]: DEBUG os_vif [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:67:9d,bridge_name='br-int',has_traffic_filtering=True,id=94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8,network=Network(d373a2f1-e506-4b6b-ab93-37e77bb02c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94bfdf6c-66') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:00:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94bfdf6c-66, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:24 user nova-compute[71474]: INFO os_vif [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:67:9d,bridge_name='br-int',has_traffic_filtering=True,id=94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8,network=Network(d373a2f1-e506-4b6b-ab93-37e77bb02c7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94bfdf6c-66') Apr 21 14:00:24 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Deleting 
instance files /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce_del Apr 21 14:00:24 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Deletion of /opt/stack/data/nova/instances/0346fbd8-64cd-45e7-906f-e00eeece91ce_del complete Apr 21 14:00:24 user nova-compute[71474]: INFO nova.compute.manager [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 21 14:00:24 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:00:24 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:00:24 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-5934d589-a1d9-465d-b5db-73dfaa6604c3 req-59ce05cd-0940-4711-9c7c-5fb7fc472bdf service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-changed-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-5934d589-a1d9-465d-b5db-73dfaa6604c3 req-59ce05cd-0940-4711-9c7c-5fb7fc472bdf service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Refreshing instance network info cache due to event network-changed-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b. 
{{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5934d589-a1d9-465d-b5db-73dfaa6604c3 req-59ce05cd-0940-4711-9c7c-5fb7fc472bdf service nova] Acquiring lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5934d589-a1d9-465d-b5db-73dfaa6604c3 req-59ce05cd-0940-4711-9c7c-5fb7fc472bdf service nova] Acquired lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG nova.network.neutron [req-5934d589-a1d9-465d-b5db-73dfaa6604c3 req-59ce05cd-0940-4711-9c7c-5fb7fc472bdf service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Refreshing network info cache for port 20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:25 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Took 0.82 seconds to deallocate network for instance. Apr 21 14:00:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.293s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:25 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Deleted allocations for instance 0346fbd8-64cd-45e7-906f-e00eeece91ce Apr 21 14:00:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG nova.network.neutron [req-5934d589-a1d9-465d-b5db-73dfaa6604c3 req-59ce05cd-0940-4711-9c7c-5fb7fc472bdf service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Updated VIF entry in instance network info cache for port 20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b. {{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG nova.network.neutron [req-5934d589-a1d9-465d-b5db-73dfaa6604c3 req-59ce05cd-0940-4711-9c7c-5fb7fc472bdf service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Updating instance_info_cache with network_info: [{"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.41", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5934d589-a1d9-465d-b5db-73dfaa6604c3 req-59ce05cd-0940-4711-9c7c-5fb7fc472bdf service nova] Releasing lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:00:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-1879b097-5784-4ece-9263-c4c4f51fdb95 tempest-AttachSCSIVolumeTestJSON-1130428952 tempest-AttachSCSIVolumeTestJSON-1130428952-project-member] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.176s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:26 user nova-compute[71474]: DEBUG nova.compute.manager [req-03956135-7234-4666-8b45-7633822ed831 req-57c24df2-0e78-48f3-8b63-cc318b4e82a3 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Received event network-vif-plugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-03956135-7234-4666-8b45-7633822ed831 req-57c24df2-0e78-48f3-8b63-cc318b4e82a3 service nova] Acquiring lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-03956135-7234-4666-8b45-7633822ed831 req-57c24df2-0e78-48f3-8b63-cc318b4e82a3 service nova] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-03956135-7234-4666-8b45-7633822ed831 req-57c24df2-0e78-48f3-8b63-cc318b4e82a3 service nova] Lock "0346fbd8-64cd-45e7-906f-e00eeece91ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:26 user nova-compute[71474]: DEBUG nova.compute.manager [req-03956135-7234-4666-8b45-7633822ed831 req-57c24df2-0e78-48f3-8b63-cc318b4e82a3 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] No waiting events found dispatching network-vif-plugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:26 user nova-compute[71474]: WARNING nova.compute.manager [req-03956135-7234-4666-8b45-7633822ed831 req-57c24df2-0e78-48f3-8b63-cc318b4e82a3 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Received unexpected event network-vif-plugged-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 for instance with vm_state deleted and task_state None. 
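[Editor's illustration, not part of the captured log.] The "Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7" records nearby carry the full placement inventory payload reported by this compute node. As a rough sketch of what that payload means for scheduling, assuming the usual placement capacity rule capacity = (total - reserved) * allocation_ratio, with the numbers copied verbatim from the log:

    # Sketch only: recompute the schedulable capacity placement would derive
    # from the inventory payload logged by the resource tracker above.
    inventory = {
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # -> VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0

In other words, this single node can carry up to 48 vCPUs worth of guests (12 host CPUs with a 4.0 overcommit ratio), about 15.1 GiB of guest RAM after the 512 MiB reservation, and 40 GB of disk.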
Apr 21 14:00:26 user nova-compute[71474]: DEBUG nova.compute.manager [req-03956135-7234-4666-8b45-7633822ed831 req-57c24df2-0e78-48f3-8b63-cc318b4e82a3 service nova] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Received event network-vif-deleted-94bfdf6c-66ef-44cc-ab29-c1b1597cf6a8 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:00:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:27 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:00:27 user nova-compute[71474]: INFO nova.compute.claims [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Claim successful on node user Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.355s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:00:28 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.policy [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b60caf53ee58417cb76a77c963a45ec2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '15f83d6d2c3049e9ba1ac7f04ad2ebb0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:00:28 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Creating image(s) Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "/opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "/opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock 
"/opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.137s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:28 user nova-compute[71474]: 
DEBUG oslo_concurrency.processutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk 1073741824" returned: 0 in 0.056s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.193s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Checking if we can resize image /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk. 
size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Cannot resize image /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk to a smaller size. {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.objects.instance [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lazy-loading 'migration_context' on Instance uuid 9164203a-8a6b-4078-bd98-c5ea7bc111fa {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Successfully created port: 70605424-311e-401f-b769-3e037210f46a {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Ensure instance console log exists: /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "vgpu_resources" by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:29 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:29 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Successfully updated port: 70605424-311e-401f-b769-3e037210f46a {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:00:29 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "refresh_cache-9164203a-8a6b-4078-bd98-c5ea7bc111fa" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:00:29 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquired lock "refresh_cache-9164203a-8a6b-4078-bd98-c5ea7bc111fa" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:00:29 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:00:29 user nova-compute[71474]: DEBUG nova.compute.manager [req-ff75b5d5-7fa9-4462-bffa-6ca45e199d0a req-b7c6e3b5-976a-4318-9daf-f9db39dc7a72 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Received event network-changed-70605424-311e-401f-b769-3e037210f46a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:29 user nova-compute[71474]: DEBUG nova.compute.manager [req-ff75b5d5-7fa9-4462-bffa-6ca45e199d0a req-b7c6e3b5-976a-4318-9daf-f9db39dc7a72 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Refreshing instance network info cache due to event network-changed-70605424-311e-401f-b769-3e037210f46a. 
{{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:00:29 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ff75b5d5-7fa9-4462-bffa-6ca45e199d0a req-b7c6e3b5-976a-4318-9daf-f9db39dc7a72 service nova] Acquiring lock "refresh_cache-9164203a-8a6b-4078-bd98-c5ea7bc111fa" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:00:29 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Instance cache missing network info. {{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Updating instance_info_cache with network_info: [{"id": "70605424-311e-401f-b769-3e037210f46a", "address": "fa:16:3e:b4:50:60", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap70605424-31", "ovs_interfaceid": "70605424-311e-401f-b769-3e037210f46a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Releasing lock "refresh_cache-9164203a-8a6b-4078-bd98-c5ea7bc111fa" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Instance network_info: |[{"id": "70605424-311e-401f-b769-3e037210f46a", "address": "fa:16:3e:b4:50:60", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap70605424-31", "ovs_interfaceid": "70605424-311e-401f-b769-3e037210f46a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ff75b5d5-7fa9-4462-bffa-6ca45e199d0a req-b7c6e3b5-976a-4318-9daf-f9db39dc7a72 service nova] Acquired lock "refresh_cache-9164203a-8a6b-4078-bd98-c5ea7bc111fa" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.network.neutron [req-ff75b5d5-7fa9-4462-bffa-6ca45e199d0a req-b7c6e3b5-976a-4318-9daf-f9db39dc7a72 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Refreshing network info cache for port 70605424-311e-401f-b769-3e037210f46a {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Start _get_guest_xml network_info=[{"id": "70605424-311e-401f-b769-3e037210f46a", "address": "fa:16:3e:b4:50:60", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap70605424-31", "ovs_interfaceid": "70605424-311e-401f-b769-3e037210f46a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml 
/opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:00:30 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:00:30 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG 
nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:00:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-570817423',display_name='tempest-VolumesAdminNegativeTest-server-570817423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-570817423',id=11,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='15f83d6d2c3049e9ba1ac7f04ad2ebb0',ramdisk_id='',reservation_id='r-6lwxzn0d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1182596808',owner_user_name='tempest-VolumesAdminNegativeTest-1182596808-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:00:28Z,user_data=None,user_id='b60caf53ee58417cb76a77c963a45ec2',uuid=9164203a-8a6b-4078-bd98-c5ea7bc111fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "70605424-311e-401f-b769-3e037210f46a", "address": "fa:16:3e:b4:50:60", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap70605424-31", "ovs_interfaceid": "70605424-311e-401f-b769-3e037210f46a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converting VIF {"id": "70605424-311e-401f-b769-3e037210f46a", "address": "fa:16:3e:b4:50:60", "network": {"id": 
"6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap70605424-31", "ovs_interfaceid": "70605424-311e-401f-b769-3e037210f46a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:50:60,bridge_name='br-int',has_traffic_filtering=True,id=70605424-311e-401f-b769-3e037210f46a,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70605424-31') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.objects.instance [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lazy-loading 'pci_devices' on Instance uuid 9164203a-8a6b-4078-bd98-c5ea7bc111fa {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] End _get_guest_xml xml= Apr 21 14:00:30 user nova-compute[71474]: 9164203a-8a6b-4078-bd98-c5ea7bc111fa Apr 21 14:00:30 user nova-compute[71474]: instance-0000000b Apr 21 14:00:30 user nova-compute[71474]: 131072 Apr 21 14:00:30 user nova-compute[71474]: 1 Apr 21 14:00:30 user nova-compute[71474]: Apr 21 14:00:30 user nova-compute[71474]: Apr 21 14:00:30 user nova-compute[71474]: Apr 21 14:00:30 user nova-compute[71474]: tempest-VolumesAdminNegativeTest-server-570817423 Apr 21 14:00:30 user nova-compute[71474]: 2023-04-21 14:00:30 Apr 21 14:00:30 user nova-compute[71474]: Apr 21 14:00:30 user nova-compute[71474]: 128 Apr 21 14:00:30 user nova-compute[71474]: 1 Apr 21 14:00:30 user nova-compute[71474]: 0 Apr 21 14:00:30 user nova-compute[71474]: 0 Apr 21 14:00:30 user nova-compute[71474]: 1 Apr 21 14:00:30 user nova-compute[71474]: Apr 21 14:00:30 user nova-compute[71474]: Apr 21 14:00:30 user nova-compute[71474]: tempest-VolumesAdminNegativeTest-1182596808-project-member Apr 21 14:00:30 user nova-compute[71474]: tempest-VolumesAdminNegativeTest-1182596808 Apr 21 14:00:30 user nova-compute[71474]: Apr 21 14:00:30 user nova-compute[71474]: Apr 21 14:00:30 user nova-compute[71474]: Apr 21 14:00:30 user nova-compute[71474]: Apr 21 14:00:30 user nova-compute[71474]: Apr 21 
14:00:30 user nova-compute[71474]: [remainder of the generated libvirt domain XML omitted: the XML markup was stripped when this log was captured, leaving only bare values spread across per-line records; the recoverable fields include sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, uuid 9164203a-8a6b-4078-bd98-c5ea7bc111fa, product Virtual Machine, os type hvm, CPU model Nehalem, and RNG backend /dev/urandom] Apr 21 14:00:30 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:00:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-570817423',display_name='tempest-VolumesAdminNegativeTest-server-570817423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-570817423',id=11,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='15f83d6d2c3049e9ba1ac7f04ad2ebb0',ramdisk_id='',reservation_id='r-6lwxzn0d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1182596808',owner_user_name='tempest-VolumesAdminNegativeTest-1182596808-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:00:28Z,user_data=None,user_id='b60caf53ee58417cb76a77c963a45ec2',uuid=9164203a-8a6b-4078-bd98-c5ea7bc111fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "70605424-311e-401f-b769-3e037210f46a", "address": "fa:16:3e:b4:50:60", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap70605424-31", "ovs_interfaceid": "70605424-311e-401f-b769-3e037210f46a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converting VIF {"id": "70605424-311e-401f-b769-3e037210f46a", "address": "fa:16:3e:b4:50:60", "network": {"id": 
"6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap70605424-31", "ovs_interfaceid": "70605424-311e-401f-b769-3e037210f46a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:50:60,bridge_name='br-int',has_traffic_filtering=True,id=70605424-311e-401f-b769-3e037210f46a,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70605424-31') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG os_vif [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:50:60,bridge_name='br-int',has_traffic_filtering=True,id=70605424-311e-401f-b769-3e037210f46a,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70605424-31') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70605424-31, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap70605424-31, col_values=(('external_ids', {'iface-id': '70605424-311e-401f-b769-3e037210f46a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:50:60', 'vm-uuid': '9164203a-8a6b-4078-bd98-c5ea7bc111fa'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:30 user nova-compute[71474]: INFO os_vif [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:50:60,bridge_name='br-int',has_traffic_filtering=True,id=70605424-311e-401f-b769-3e037210f46a,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70605424-31') Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] No VIF found with MAC fa:16:3e:b4:50:60, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.network.neutron [req-ff75b5d5-7fa9-4462-bffa-6ca45e199d0a req-b7c6e3b5-976a-4318-9daf-f9db39dc7a72 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Updated VIF entry in instance network info cache for port 70605424-311e-401f-b769-3e037210f46a. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG nova.network.neutron [req-ff75b5d5-7fa9-4462-bffa-6ca45e199d0a req-b7c6e3b5-976a-4318-9daf-f9db39dc7a72 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Updating instance_info_cache with network_info: [{"id": "70605424-311e-401f-b769-3e037210f46a", "address": "fa:16:3e:b4:50:60", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap70605424-31", "ovs_interfaceid": "70605424-311e-401f-b769-3e037210f46a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ff75b5d5-7fa9-4462-bffa-6ca45e199d0a req-b7c6e3b5-976a-4318-9daf-f9db39dc7a72 service nova] Releasing lock "refresh_cache-9164203a-8a6b-4078-bd98-c5ea7bc111fa" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:00:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:32 user nova-compute[71474]: DEBUG nova.compute.manager [req-3409afc9-ec49-4599-9b0b-6f74bde02ab3 req-cd644b12-4abb-4732-aefe-92703cd5dc4d service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Received event network-vif-plugged-70605424-311e-401f-b769-3e037210f46a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-3409afc9-ec49-4599-9b0b-6f74bde02ab3 req-cd644b12-4abb-4732-aefe-92703cd5dc4d service nova] Acquiring lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-3409afc9-ec49-4599-9b0b-6f74bde02ab3 req-cd644b12-4abb-4732-aefe-92703cd5dc4d service nova] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-3409afc9-ec49-4599-9b0b-6f74bde02ab3 req-cd644b12-4abb-4732-aefe-92703cd5dc4d service nova] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:32 user nova-compute[71474]: DEBUG nova.compute.manager [req-3409afc9-ec49-4599-9b0b-6f74bde02ab3 req-cd644b12-4abb-4732-aefe-92703cd5dc4d service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] No waiting events found dispatching network-vif-plugged-70605424-311e-401f-b769-3e037210f46a {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:32 user nova-compute[71474]: WARNING nova.compute.manager [req-3409afc9-ec49-4599-9b0b-6f74bde02ab3 req-cd644b12-4abb-4732-aefe-92703cd5dc4d service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Received unexpected event network-vif-plugged-70605424-311e-401f-b769-3e037210f46a for instance with vm_state building and task_state spawning. Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:00:33 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] VM Resumed (Lifecycle Event) Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:00:33 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Instance spawned successfully. 
Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 
9164203a-8a6b-4078-bd98-c5ea7bc111fa] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:33 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:00:33 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] VM Started (Lifecycle Event) Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:00:33 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:00:33 user nova-compute[71474]: INFO nova.compute.manager [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Took 5.50 seconds to spawn the instance on the hypervisor. Apr 21 14:00:33 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:33 user nova-compute[71474]: INFO nova.compute.manager [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Took 6.17 seconds to build instance. 
Apr 21 14:00:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f4e02dfc-bca3-4a98-a8c3-d3a7a4c3c6ae tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.269s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:34 user nova-compute[71474]: DEBUG nova.compute.manager [req-062ed02e-497c-417a-a41c-3035fd63bca1 req-1a776488-cbe6-498c-9aa3-4b6e06489013 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Received event network-vif-plugged-70605424-311e-401f-b769-3e037210f46a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-062ed02e-497c-417a-a41c-3035fd63bca1 req-1a776488-cbe6-498c-9aa3-4b6e06489013 service nova] Acquiring lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-062ed02e-497c-417a-a41c-3035fd63bca1 req-1a776488-cbe6-498c-9aa3-4b6e06489013 service nova] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-062ed02e-497c-417a-a41c-3035fd63bca1 req-1a776488-cbe6-498c-9aa3-4b6e06489013 service nova] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:34 user nova-compute[71474]: DEBUG nova.compute.manager [req-062ed02e-497c-417a-a41c-3035fd63bca1 req-1a776488-cbe6-498c-9aa3-4b6e06489013 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] No waiting events found dispatching network-vif-plugged-70605424-311e-401f-b769-3e037210f46a {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:34 user nova-compute[71474]: WARNING nova.compute.manager [req-062ed02e-497c-417a-a41c-3035fd63bca1 req-1a776488-cbe6-498c-9aa3-4b6e06489013 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Received unexpected event network-vif-plugged-70605424-311e-401f-b769-3e037210f46a for instance with vm_state active and task_state None. 
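[Editor's note, not part of the captured log: the paired "acquired by ... :: waited" / "released by ... :: held" messages above come from oslo.concurrency's lockutils, which Nova uses around its per-instance event handling. A minimal sketch of that locking pattern follows; the function body and names are hypothetical, only the lock-name convention ("<uuid>-events") is taken from the log.]

```python
# Minimal sketch of the lockutils pattern behind the "-events" lock lines
# above; the handler is illustrative, not Nova's actual code.
from oslo_concurrency import lockutils

INSTANCE_UUID = "9164203a-8a6b-4078-bd98-c5ea7bc111fa"

@lockutils.synchronized(f"{INSTANCE_UUID}-events")
def pop_event(name, tag):
    # Runs only while holding the per-instance events lock; lockutils emits
    # the "acquired ... :: waited" and "released ... :: held" DEBUG lines.
    return (name, tag)

# The same lock can also be taken explicitly as a context manager:
with lockutils.lock(f"{INSTANCE_UUID}-events"):
    pass
```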
Apr 21 14:00:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:39 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:00:39 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] VM Stopped (Lifecycle Event) Apr 21 14:00:39 user nova-compute[71474]: DEBUG nova.compute.manager [None req-33ff56e3-1ca0-448d-9a2b-8e249fe78ded None None] [instance: 0346fbd8-64cd-45e7-906f-e00eeece91ce] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:46 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:46 user nova-compute[71474]: INFO nova.compute.manager [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] instance snapshotting Apr 21 14:00:46 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Beginning live snapshot process Apr 21 14:00:47 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json -f qcow2 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:47 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 
tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json -f qcow2" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:47 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json -f qcow2 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:47 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json -f qcow2" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:47 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:47 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.129s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:47 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmptw3zfym8/f27e4a6de5c5493da273bca5fa9258e4.delta 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:47 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None 
req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmptw3zfym8/f27e4a6de5c5493da273bca5fa9258e4.delta 1073741824" returned: 0 in 0.046s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:47 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Quiescing instance not available: QEMU guest agent is not enabled. Apr 21 14:00:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.guest [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71474) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 14:00:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.guest [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71474) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 14:00:48 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Skipping quiescing instance: QEMU guest agent is not enabled. 
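[Editor's note, not part of the captured log: the "python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info ..." commands above are what oslo.concurrency's processutils produces when execute() is given a ProcessLimits prlimit. A hedged sketch of issuing the same qemu-img info call that way is shown below; the disk path and limits are copied from the log, the wrapper function is illustrative.]

```python
# Sketch of running "qemu-img info" under the same resource limits seen in
# the log (--as=1073741824 --cpu=30). The path is the one logged above; the
# helper function itself is an assumption, not Nova's implementation.
import json
from oslo_concurrency import processutils

DISK = "/opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk"

def qemu_img_info(path, img_format="qcow2"):
    limits = processutils.ProcessLimits(address_space=1 * 1024 ** 3,  # --as=1073741824
                                        cpu_time=30)                  # --cpu=30
    out, _err = processutils.execute(
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", path, "--force-share", "--output=json",
        "-f", img_format,
        prlimit=limits)
    return json.loads(out)

if __name__ == "__main__":
    print(qemu_img_info(DISK).get("virtual-size"))
```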
Apr 21 14:00:48 user nova-compute[71474]: DEBUG nova.privsep.utils [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71474) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 14:00:48 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmptw3zfym8/f27e4a6de5c5493da273bca5fa9258e4.delta /opt/stack/data/nova/instances/snapshots/tmptw3zfym8/f27e4a6de5c5493da273bca5fa9258e4 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:49 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmptw3zfym8/f27e4a6de5c5493da273bca5fa9258e4.delta /opt/stack/data/nova/instances/snapshots/tmptw3zfym8/f27e4a6de5c5493da273bca5fa9258e4" returned: 0 in 0.336s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:49 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Snapshot extracted, beginning image upload Apr 21 14:00:49 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:00:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:51 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Snapshot image upload complete Apr 21 14:00:51 user nova-compute[71474]: INFO nova.compute.manager [None req-f3ece249-25a0-494e-9f7d-dc3c1beb613c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Took 4.64 seconds to snapshot the instance on the hypervisor. 
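[Editor's note, not part of the captured log: the live-snapshot sequence above reduces to two qemu-img invocations, creating a qcow2 overlay on the cached base image for the COPY block job and then flattening the copied delta before upload. The sketch below mirrors those exact commands with the paths and size from the log; running them by hand outside Nova is only illustrative.]

```python
# Reproduces, for illustration, the two qemu-img commands in the snapshot
# sequence above: create a qcow2 overlay backed by the cached base image,
# then convert the copied delta into a standalone qcow2. Paths and size are
# the ones logged; standalone use outside Nova is an assumption.
import subprocess

BASE = "/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7"
DELTA = "/opt/stack/data/nova/instances/snapshots/tmptw3zfym8/f27e4a6de5c5493da273bca5fa9258e4.delta"
OUT = "/opt/stack/data/nova/instances/snapshots/tmptw3zfym8/f27e4a6de5c5493da273bca5fa9258e4"
SIZE = "1073741824"  # 1 GiB root disk, matching the logged create command

# Overlay that the running domain's COPY block job writes into.
subprocess.run(["qemu-img", "create", "-f", "qcow2",
                "-o", f"backing_file={BASE},backing_fmt=raw",
                DELTA, SIZE], check=True)

# Flatten the delta into a self-contained qcow2 ready for image upload.
subprocess.run(["qemu-img", "convert", "-t", "none",
                "-O", "qcow2", "-f", "qcow2", DELTA, OUT], check=True)
```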
Apr 21 14:00:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:00:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:00:51 user nova-compute[71474]: INFO nova.compute.claims [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Claim successful on node user Apr 21 14:00:51 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:00:51 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:00:51 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:00:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.355s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:00:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Allocating IP information in the background. 
{{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:00:51 user nova-compute[71474]: DEBUG nova.network.neutron [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:00:51 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.compute.manager [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.compute.manager [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:00:52 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Creating image(s) Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "/opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "/opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "/opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk.info" 
"released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.policy [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '113884844de14ec7ac8a20ba06a389b3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '91f5972380fd48eabffd46e6727239ce', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.149s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk 1073741824" returned: 0 in 0.048s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.186s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 
tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Checking if we can resize image /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Acquiring lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Acquiring lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:52 user nova-compute[71474]: INFO nova.compute.manager [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Terminating instance Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.compute.manager [None 
req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Cannot resize image /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk to a smaller size. {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.objects.instance [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lazy-loading 'migration_context' on Instance uuid ef0a7b15-eab4-4705-9f70-9c9117736eb1 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Ensure instance console log exists: /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock 
"vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.compute.manager [req-4b614965-5cfa-4604-bf93-70cf202b8a94 req-8979c528-687c-43e2-b350-b9a394ef7192 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Received event network-vif-unplugged-806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-4b614965-5cfa-4604-bf93-70cf202b8a94 req-8979c528-687c-43e2-b350-b9a394ef7192 service nova] Acquiring lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-4b614965-5cfa-4604-bf93-70cf202b8a94 req-8979c528-687c-43e2-b350-b9a394ef7192 service nova] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-4b614965-5cfa-4604-bf93-70cf202b8a94 req-8979c528-687c-43e2-b350-b9a394ef7192 service nova] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.compute.manager [req-4b614965-5cfa-4604-bf93-70cf202b8a94 req-8979c528-687c-43e2-b350-b9a394ef7192 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] No waiting events found dispatching network-vif-unplugged-806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:52 user nova-compute[71474]: DEBUG nova.compute.manager [req-4b614965-5cfa-4604-bf93-70cf202b8a94 req-8979c528-687c-43e2-b350-b9a394ef7192 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Received event 
network-vif-unplugged-806c5236-f27a-4616-9ec6-85a2585f753a for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.network.neutron [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Successfully created port: 662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:00:53 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Instance destroyed successfully. Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.objects.instance [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lazy-loading 'resources' on Instance uuid 90591d9b-6d6b-4f22-a3dc-fd83044df26b {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:59:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1763512323',display_name='tempest-DeleteServersTestJSON-server-1763512323',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1763512323',id=8,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T13:59:07Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f4add6cbbd424513a25dacd5dfb3adcf',ramdisk_id='',reservation_id='r-hbsf4k4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-DeleteServersTestJSON-356048122',owner_user_name='tempest-DeleteServersTestJSON-356048122-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T13:59:08Z,user_data=None,user_id='f09a36fbec134e248aa2baec4bb6a53b',uuid=90591d9b-6d6b-4f22-a3dc-fd83044df26b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "806c5236-f27a-4616-9ec6-85a2585f753a", "address": "fa:16:3e:0e:86:45", "network": {"id": "d4871732-c28b-456f-8efe-8f0c46a4107d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-86130370-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f4add6cbbd424513a25dacd5dfb3adcf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap806c5236-f2", "ovs_interfaceid": "806c5236-f27a-4616-9ec6-85a2585f753a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Converting VIF {"id": "806c5236-f27a-4616-9ec6-85a2585f753a", "address": 
"fa:16:3e:0e:86:45", "network": {"id": "d4871732-c28b-456f-8efe-8f0c46a4107d", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-86130370-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f4add6cbbd424513a25dacd5dfb3adcf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap806c5236-f2", "ovs_interfaceid": "806c5236-f27a-4616-9ec6-85a2585f753a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:86:45,bridge_name='br-int',has_traffic_filtering=True,id=806c5236-f27a-4616-9ec6-85a2585f753a,network=Network(d4871732-c28b-456f-8efe-8f0c46a4107d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap806c5236-f2') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG os_vif [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:86:45,bridge_name='br-int',has_traffic_filtering=True,id=806c5236-f27a-4616-9ec6-85a2585f753a,network=Network(d4871732-c28b-456f-8efe-8f0c46a4107d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap806c5236-f2') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap806c5236-f2, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:53 user nova-compute[71474]: INFO os_vif [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:0e:86:45,bridge_name='br-int',has_traffic_filtering=True,id=806c5236-f27a-4616-9ec6-85a2585f753a,network=Network(d4871732-c28b-456f-8efe-8f0c46a4107d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap806c5236-f2') Apr 21 14:00:53 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Deleting instance files /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b_del Apr 21 14:00:53 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Deletion of /opt/stack/data/nova/instances/90591d9b-6d6b-4f22-a3dc-fd83044df26b_del complete Apr 21 14:00:53 user nova-compute[71474]: INFO nova.compute.manager [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Updating instance_info_cache with network_info: [{"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] 
{{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-2c5afe45-87ae-477a-8bf0-6a5e2036fb68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.network.neutron [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Successfully updated port: 662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:00:53 user 
nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "refresh_cache-ef0a7b15-eab4-4705-9f70-9c9117736eb1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquired lock "refresh_cache-ef0a7b15-eab4-4705-9f70-9c9117736eb1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.network.neutron [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.compute.manager [req-84a07417-c219-4078-8117-555e40733684 req-6b8ed968-67b8-4086-a487-378fb1e9246b service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-changed-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.compute.manager [req-84a07417-c219-4078-8117-555e40733684 req-6b8ed968-67b8-4086-a487-378fb1e9246b service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Refreshing instance network info cache due to event network-changed-662f0568-767d-4dd9-b220-2936b4d96745. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-84a07417-c219-4078-8117-555e40733684 req-6b8ed968-67b8-4086-a487-378fb1e9246b service nova] Acquiring lock "refresh_cache-ef0a7b15-eab4-4705-9f70-9c9117736eb1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:53 user nova-compute[71474]: DEBUG nova.network.neutron [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.network.neutron [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Updating instance_info_cache with network_info: [{"id": "662f0568-767d-4dd9-b220-2936b4d96745", "address": "fa:16:3e:bf:9c:e9", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap662f0568-76", "ovs_interfaceid": "662f0568-767d-4dd9-b220-2936b4d96745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 
tempest-AttachVolumeTestJSON-1194238008-project-member] Releasing lock "refresh_cache-ef0a7b15-eab4-4705-9f70-9c9117736eb1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.compute.manager [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Instance network_info: |[{"id": "662f0568-767d-4dd9-b220-2936b4d96745", "address": "fa:16:3e:bf:9c:e9", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap662f0568-76", "ovs_interfaceid": "662f0568-767d-4dd9-b220-2936b4d96745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-84a07417-c219-4078-8117-555e40733684 req-6b8ed968-67b8-4086-a487-378fb1e9246b service nova] Acquired lock "refresh_cache-ef0a7b15-eab4-4705-9f70-9c9117736eb1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.network.neutron [req-84a07417-c219-4078-8117-555e40733684 req-6b8ed968-67b8-4086-a487-378fb1e9246b service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Refreshing network info cache for port 662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Start _get_guest_xml network_info=[{"id": "662f0568-767d-4dd9-b220-2936b4d96745", "address": "fa:16:3e:bf:9c:e9", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap662f0568-76", "ovs_interfaceid": "662f0568-767d-4dd9-b220-2936b4d96745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:00:54 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:00:54 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:00:54 
user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:00:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2136406868',display_name='tempest-AttachVolumeTestJSON-server-2136406868',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-2136406868',id=12,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYfx0D/9HI6dRD4wXlpAeizOIf9VmC7glu2drchWSBPGsmZxukl0JKQLSPBGvOnDZea9iBw8HpwJNLK6oFXfEHEkUs1WkQz/KQVrF/Jrc/AnOokiNeEKYxBPPCAEmxZ0Q==',key_name='tempest-keypair-1493985608',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='91f5972380fd48eabffd46e6727239ce',ramdisk_id='',reservation_id='r-fvg0azuk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1194238008',owner_user_name='tempest-AttachVolumeTestJSON-1194238008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:00:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='113884844de14ec7ac8a20ba06a389b3',uuid=ef0a7b15-eab4-4705-9f70-9c9117736eb1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "662f0568-767d-4dd9-b220-2936b4d96745", "address": "fa:16:3e:bf:9c:e9", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap662f0568-76", "ovs_interfaceid": "662f0568-767d-4dd9-b220-2936b4d96745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converting VIF {"id": "662f0568-767d-4dd9-b220-2936b4d96745", "address": "fa:16:3e:bf:9c:e9", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap662f0568-76", "ovs_interfaceid": "662f0568-767d-4dd9-b220-2936b4d96745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:9c:e9,bridge_name='br-int',has_traffic_filtering=True,id=662f0568-767d-4dd9-b220-2936b4d96745,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap662f0568-76') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.objects.instance [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lazy-loading 'pci_devices' on Instance uuid ef0a7b15-eab4-4705-9f70-9c9117736eb1 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.compute.manager [req-8fa92328-d77e-4109-a8f7-9f12bb3a5dad req-edcb7560-07d4-4e12-bf96-a1a874ae093a service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Received event network-vif-deleted-806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:54 user nova-compute[71474]: INFO nova.compute.manager [req-8fa92328-d77e-4109-a8f7-9f12bb3a5dad req-edcb7560-07d4-4e12-bf96-a1a874ae093a service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Neutron deleted interface 806c5236-f27a-4616-9ec6-85a2585f753a; detaching it from the instance and deleting it from the info cache Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.network.neutron [req-8fa92328-d77e-4109-a8f7-9f12bb3a5dad req-edcb7560-07d4-4e12-bf96-a1a874ae093a service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} 
Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] End _get_guest_xml xml= Apr 21 14:00:54 user nova-compute[71474]: ef0a7b15-eab4-4705-9f70-9c9117736eb1 Apr 21 14:00:54 user nova-compute[71474]: instance-0000000c Apr 21 14:00:54 user nova-compute[71474]: 131072 Apr 21 14:00:54 user nova-compute[71474]: 1 Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: tempest-AttachVolumeTestJSON-server-2136406868 Apr 21 14:00:54 user nova-compute[71474]: 2023-04-21 14:00:54 Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: 128 Apr 21 14:00:54 user nova-compute[71474]: 1 Apr 21 14:00:54 user nova-compute[71474]: 0 Apr 21 14:00:54 user nova-compute[71474]: 0 Apr 21 14:00:54 user nova-compute[71474]: 1 Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: tempest-AttachVolumeTestJSON-1194238008-project-member Apr 21 14:00:54 user nova-compute[71474]: tempest-AttachVolumeTestJSON-1194238008 Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: OpenStack Foundation Apr 21 14:00:54 user nova-compute[71474]: OpenStack Nova Apr 21 14:00:54 user nova-compute[71474]: 0.0.0 Apr 21 14:00:54 user nova-compute[71474]: ef0a7b15-eab4-4705-9f70-9c9117736eb1 Apr 21 14:00:54 user nova-compute[71474]: ef0a7b15-eab4-4705-9f70-9c9117736eb1 Apr 21 14:00:54 user nova-compute[71474]: Virtual Machine Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: hvm Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Nehalem Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user 
nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: /dev/urandom Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: Apr 21 14:00:54 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:00:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2136406868',display_name='tempest-AttachVolumeTestJSON-server-2136406868',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-2136406868',id=12,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYfx0D/9HI6dRD4wXlpAeizOIf9VmC7glu2drchWSBPGsmZxukl0JKQLSPBGvOnDZea9iBw8HpwJNLK6oFXfEHEkUs1WkQz/KQVrF/Jrc/AnOokiNeEKYxBPPCAEmxZ0Q==',key_name='tempest-keypair-1493985608',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='91f5972380fd48eabffd46e6727239ce',ramdisk_id='',reservation_id='r-fvg0azuk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1194238008',owner_user_name='tempest-AttachVolumeTestJSON-1194238008-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:00:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='113884844de14ec7ac8a20ba06a389b3',uuid=ef0a7b15-eab4-4705-9f70-9c9117736eb1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "662f0568-767d-4dd9-b220-2936b4d96745", "address": "fa:16:3e:bf:9c:e9", "network": {"id": 
"23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap662f0568-76", "ovs_interfaceid": "662f0568-767d-4dd9-b220-2936b4d96745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converting VIF {"id": "662f0568-767d-4dd9-b220-2936b4d96745", "address": "fa:16:3e:bf:9c:e9", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap662f0568-76", "ovs_interfaceid": "662f0568-767d-4dd9-b220-2936b4d96745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:9c:e9,bridge_name='br-int',has_traffic_filtering=True,id=662f0568-767d-4dd9-b220-2936b4d96745,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap662f0568-76') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG os_vif [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:9c:e9,bridge_name='br-int',has_traffic_filtering=True,id=662f0568-767d-4dd9-b220-2936b4d96745,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap662f0568-76') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.network.neutron 
[-] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:00:54 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Took 0.90 seconds to deallocate network for instance. Apr 21 14:00:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap662f0568-76, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap662f0568-76, col_values=(('external_ids', {'iface-id': '662f0568-767d-4dd9-b220-2936b4d96745', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:9c:e9', 'vm-uuid': 'ef0a7b15-eab4-4705-9f70-9c9117736eb1'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.compute.manager [req-8fa92328-d77e-4109-a8f7-9f12bb3a5dad req-edcb7560-07d4-4e12-bf96-a1a874ae093a service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Detach interface failed, port_id=806c5236-f27a-4616-9ec6-85a2585f753a, reason: Instance 90591d9b-6d6b-4f22-a3dc-fd83044df26b could not be found. 
{{(pid=71474) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:54 user nova-compute[71474]: INFO os_vif [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:9c:e9,bridge_name='br-int',has_traffic_filtering=True,id=662f0568-767d-4dd9-b220-2936b4d96745,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap662f0568-76') Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] No VIF found with MAC fa:16:3e:bf:9c:e9, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.network.neutron [req-84a07417-c219-4078-8117-555e40733684 req-6b8ed968-67b8-4086-a487-378fb1e9246b service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Updated VIF entry in instance network info cache for port 662f0568-767d-4dd9-b220-2936b4d96745. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.network.neutron [req-84a07417-c219-4078-8117-555e40733684 req-6b8ed968-67b8-4086-a487-378fb1e9246b service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Updating instance_info_cache with network_info: [{"id": "662f0568-767d-4dd9-b220-2936b4d96745", "address": "fa:16:3e:bf:9c:e9", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap662f0568-76", "ovs_interfaceid": "662f0568-767d-4dd9-b220-2936b4d96745", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-84a07417-c219-4078-8117-555e40733684 req-6b8ed968-67b8-4086-a487-378fb1e9246b service nova] Releasing lock "refresh_cache-ef0a7b15-eab4-4705-9f70-9c9117736eb1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.424s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:54 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Deleted allocations for instance 90591d9b-6d6b-4f22-a3dc-fd83044df26b Apr 21 14:00:55 user nova-compute[71474]: DEBUG nova.compute.manager [req-fee52047-f0ca-44ad-8acc-e493ef81b5de req-435c6ccd-bce8-4faa-84a1-5bbccbc19e41 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Received event network-vif-plugged-806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-fee52047-f0ca-44ad-8acc-e493ef81b5de req-435c6ccd-bce8-4faa-84a1-5bbccbc19e41 service nova] Acquiring lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-fee52047-f0ca-44ad-8acc-e493ef81b5de req-435c6ccd-bce8-4faa-84a1-5bbccbc19e41 service nova] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-fee52047-f0ca-44ad-8acc-e493ef81b5de req-435c6ccd-bce8-4faa-84a1-5bbccbc19e41 service nova] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG nova.compute.manager [req-fee52047-f0ca-44ad-8acc-e493ef81b5de req-435c6ccd-bce8-4faa-84a1-5bbccbc19e41 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] No waiting events found dispatching network-vif-plugged-806c5236-f27a-4616-9ec6-85a2585f753a {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:55 user nova-compute[71474]: WARNING nova.compute.manager [req-fee52047-f0ca-44ad-8acc-e493ef81b5de req-435c6ccd-bce8-4faa-84a1-5bbccbc19e41 service nova] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Received unexpected event network-vif-plugged-806c5236-f27a-4616-9ec6-85a2585f753a for instance with vm_state deleted and task_state None. 
Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e11dd171-27dc-4819-b654-49d63347a914 tempest-DeleteServersTestJSON-356048122 tempest-DeleteServersTestJSON-356048122-project-member] Lock "90591d9b-6d6b-4f22-a3dc-fd83044df26b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.378s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:55 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:00:58 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] VM Resumed (Lifecycle Event) Apr 21 14:00:58 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.manager [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:00:58 user nova-compute[71474]: 
DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.manager [req-df134d11-aa77-4544-87ac-879e60e335d5 req-e8b01452-dc2f-48d9-b58f-173d19293fb7 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-df134d11-aa77-4544-87ac-879e60e335d5 req-e8b01452-dc2f-48d9-b58f-173d19293fb7 service nova] Acquiring lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-df134d11-aa77-4544-87ac-879e60e335d5 req-e8b01452-dc2f-48d9-b58f-173d19293fb7 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-df134d11-aa77-4544-87ac-879e60e335d5 req-e8b01452-dc2f-48d9-b58f-173d19293fb7 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.manager [req-df134d11-aa77-4544-87ac-879e60e335d5 req-e8b01452-dc2f-48d9-b58f-173d19293fb7 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] No waiting events found dispatching network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:58 user nova-compute[71474]: WARNING nova.compute.manager [req-df134d11-aa77-4544-87ac-879e60e335d5 req-e8b01452-dc2f-48d9-b58f-173d19293fb7 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received unexpected event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 for instance with vm_state building and task_state spawning. 
Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.manager [req-80e5a461-559e-4438-9c8c-cb9adc53241b req-6bbdabfe-0211-479b-a8c4-5a7d07a0d89c service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-80e5a461-559e-4438-9c8c-cb9adc53241b req-6bbdabfe-0211-479b-a8c4-5a7d07a0d89c service nova] Acquiring lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-80e5a461-559e-4438-9c8c-cb9adc53241b req-6bbdabfe-0211-479b-a8c4-5a7d07a0d89c service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-80e5a461-559e-4438-9c8c-cb9adc53241b req-6bbdabfe-0211-479b-a8c4-5a7d07a0d89c service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.manager [req-80e5a461-559e-4438-9c8c-cb9adc53241b req-6bbdabfe-0211-479b-a8c4-5a7d07a0d89c service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] No waiting events found dispatching network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:00:58 user nova-compute[71474]: WARNING nova.compute.manager [req-80e5a461-559e-4438-9c8c-cb9adc53241b req-6bbdabfe-0211-479b-a8c4-5a7d07a0d89c service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received unexpected event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 for instance with vm_state building and task_state spawning. Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:00:58 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:00:58 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8209MB free_disk=26.046585083007812GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:00:58 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Instance spawned successfully. Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:00:58 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:00:58 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] VM Started (Lifecycle Event) Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event 
/opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:00:58 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:00:58 user nova-compute[71474]: INFO nova.compute.manager [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Took 6.32 seconds to spawn the instance on the hypervisor. Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.manager [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 30068c4a-94ed-4b84-9178-0d554326fc68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 2c5afe45-87ae-477a-8bf0-6a5e2036fb68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance f0f32b68-6993-4843-bcc6-bd0e06377b27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5e502c4c-a46b-4670-acba-2fda2d05adf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 2ae07df3-4bf4-44a5-a772-3507a6dde6ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 9164203a-8a6b-4078-bd98-c5ea7bc111fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance ef0a7b15-eab4-4705-9f70-9c9117736eb1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=8GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:00:58 user nova-compute[71474]: INFO nova.compute.manager [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Took 7.00 seconds to build instance. 
Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-994d5aa4-e395-4df9-89da-ce7ccd7d4464 tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.094s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.425s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:00:58 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:00:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "96ecc039-866c-4a11-969f-cb59bd0a4f66" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:01:00 user nova-compute[71474]: INFO nova.compute.claims [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Claim successful on node user Apr 21 14:01:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.392s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Allocating IP information in the background. 
{{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:01:00 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:01:00 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:01:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG nova.policy [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '92c19bad528a4c38860a43913b28b85b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a8fedc10f324a92aef4142ab7efdd6a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Start spawning the instance on the hypervisor. 
{{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:01:01 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Creating image(s) Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "/opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "/opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "/opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils 
[None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.144s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk 1073741824" returned: 0 in 0.047s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.199s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.145s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Checking if we can resize image /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Cannot resize image /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk to a smaller size. 
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG nova.objects.instance [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lazy-loading 'migration_context' on Instance uuid 96ecc039-866c-4a11-969f-cb59bd0a4f66 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Ensure instance console log exists: /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Successfully created port: 94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Successfully updated port: 94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None 
req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "refresh_cache-96ecc039-866c-4a11-969f-cb59bd0a4f66" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquired lock "refresh_cache-96ecc039-866c-4a11-969f-cb59bd0a4f66" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.compute.manager [req-5c6a19ca-20ca-4a96-a8e2-03e5139aa169 req-c83b51ab-2a45-40f7-bf88-000e083f48a9 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Received event network-changed-94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.compute.manager [req-5c6a19ca-20ca-4a96-a8e2-03e5139aa169 req-c83b51ab-2a45-40f7-bf88-000e083f48a9 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Refreshing instance network info cache due to event network-changed-94f4ff5c-4646-4e4d-814a-40e8e72ad32e. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5c6a19ca-20ca-4a96-a8e2-03e5139aa169 req-c83b51ab-2a45-40f7-bf88-000e083f48a9 service nova] Acquiring lock "refresh_cache-96ecc039-866c-4a11-969f-cb59bd0a4f66" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.network.neutron [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Updating instance_info_cache with network_info: [{"id": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "address": "fa:16:3e:aa:82:d6", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94f4ff5c-46", "ovs_interfaceid": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Releasing lock "refresh_cache-96ecc039-866c-4a11-969f-cb59bd0a4f66" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Instance network_info: |[{"id": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "address": "fa:16:3e:aa:82:d6", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94f4ff5c-46", "ovs_interfaceid": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5c6a19ca-20ca-4a96-a8e2-03e5139aa169 req-c83b51ab-2a45-40f7-bf88-000e083f48a9 service nova] Acquired lock "refresh_cache-96ecc039-866c-4a11-969f-cb59bd0a4f66" {{(pid=71474) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.network.neutron [req-5c6a19ca-20ca-4a96-a8e2-03e5139aa169 req-c83b51ab-2a45-40f7-bf88-000e083f48a9 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Refreshing network info cache for port 94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Start _get_guest_xml network_info=[{"id": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "address": "fa:16:3e:aa:82:d6", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94f4ff5c-46", "ovs_interfaceid": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:01:03 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:01:03 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:01:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1998626577',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1998626577',id=13,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEjLzbn3Rtk7UAvXNCb/CqU3NFzUWrQkjNzooEbdsk7L34/ttrJQbQ8G2+mwvgH50DTejh2ROEqL19gr64B+vVPiL7Dti7Dkj0m8tNJC6vM/rbQizA3VE78YsalZDJlEwQ==',key_name='tempest-keypair-33325504',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a8fedc10f324a92aef4142ab7efdd6a',ramdisk_id='',reservation_id='r-n7znc46z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-2115713901',owner_user_name='tempest-AttachVolumeShelveTestJSON-2115713901-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:01:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='92c19bad528a4c38860a43913b28b85b',uuid=96ecc039-866c-4a11-969f-cb59bd0a4f66,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "address": "fa:16:3e:aa:82:d6", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94f4ff5c-46", "ovs_interfaceid": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converting VIF {"id": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "address": "fa:16:3e:aa:82:d6", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94f4ff5c-46", "ovs_interfaceid": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:82:d6,bridge_name='br-int',has_traffic_filtering=True,id=94f4ff5c-4646-4e4d-814a-40e8e72ad32e,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94f4ff5c-46') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.objects.instance [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lazy-loading 'pci_devices' on Instance uuid 96ecc039-866c-4a11-969f-cb59bd0a4f66 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] End _get_guest_xml xml= Apr 21 14:01:03 user nova-compute[71474]: 96ecc039-866c-4a11-969f-cb59bd0a4f66 Apr 21 14:01:03 user nova-compute[71474]: instance-0000000d Apr 21 14:01:03 user nova-compute[71474]: 131072 Apr 21 14:01:03 user nova-compute[71474]: 1 Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: tempest-AttachVolumeShelveTestJSON-server-1998626577 Apr 21 14:01:03 user nova-compute[71474]: 2023-04-21 14:01:03 Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: 128 Apr 21 14:01:03 user nova-compute[71474]: 1 Apr 21 14:01:03 user nova-compute[71474]: 0 Apr 21 14:01:03 user nova-compute[71474]: 0 Apr 21 14:01:03 user nova-compute[71474]: 1 Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: tempest-AttachVolumeShelveTestJSON-2115713901-project-member Apr 21 14:01:03 user nova-compute[71474]: tempest-AttachVolumeShelveTestJSON-2115713901 Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: OpenStack Foundation Apr 21 14:01:03 user nova-compute[71474]: OpenStack Nova Apr 21 14:01:03 user 
nova-compute[71474]: 0.0.0 Apr 21 14:01:03 user nova-compute[71474]: 96ecc039-866c-4a11-969f-cb59bd0a4f66 Apr 21 14:01:03 user nova-compute[71474]: 96ecc039-866c-4a11-969f-cb59bd0a4f66 Apr 21 14:01:03 user nova-compute[71474]: Virtual Machine Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: hvm Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Nehalem Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: /dev/urandom Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: Apr 21 14:01:03 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:01:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1998626577',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1998626577',id=13,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEjLzbn3Rtk7UAvXNCb/CqU3NFzUWrQkjNzooEbdsk7L34/ttrJQbQ8G2+mwvgH50DTejh2ROEqL19gr64B+vVPiL7Dti7Dkj0m8tNJC6vM/rbQizA3VE78YsalZDJlEwQ==',key_name='tempest-keypair-33325504',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a8fedc10f324a92aef4142ab7efdd6a',ramdisk_id='',reservation_id='r-n7znc46z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-2115713901',owner_user_name='tempest-AttachVolumeShelveTestJSON-2115713901-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:01:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='92c19bad528a4c38860a43913b28b85b',uuid=96ecc039-866c-4a11-969f-cb59bd0a4f66,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "address": "fa:16:3e:aa:82:d6", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94f4ff5c-46", "ovs_interfaceid": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converting VIF {"id": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "address": "fa:16:3e:aa:82:d6", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94f4ff5c-46", "ovs_interfaceid": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:82:d6,bridge_name='br-int',has_traffic_filtering=True,id=94f4ff5c-4646-4e4d-814a-40e8e72ad32e,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94f4ff5c-46') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG os_vif [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:82:d6,bridge_name='br-int',has_traffic_filtering=True,id=94f4ff5c-4646-4e4d-814a-40e8e72ad32e,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94f4ff5c-46') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94f4ff5c-46, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap94f4ff5c-46, col_values=(('external_ids', {'iface-id': '94f4ff5c-4646-4e4d-814a-40e8e72ad32e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:aa:82:d6', 'vm-uuid': '96ecc039-866c-4a11-969f-cb59bd0a4f66'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:01:03 user 
nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:03 user nova-compute[71474]: INFO os_vif [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:82:d6,bridge_name='br-int',has_traffic_filtering=True,id=94f4ff5c-4646-4e4d-814a-40e8e72ad32e,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94f4ff5c-46') Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:01:03 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] No VIF found with MAC fa:16:3e:aa:82:d6, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:01:04 user nova-compute[71474]: DEBUG nova.network.neutron [req-5c6a19ca-20ca-4a96-a8e2-03e5139aa169 req-c83b51ab-2a45-40f7-bf88-000e083f48a9 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Updated VIF entry in instance network info cache for port 94f4ff5c-4646-4e4d-814a-40e8e72ad32e. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:01:04 user nova-compute[71474]: DEBUG nova.network.neutron [req-5c6a19ca-20ca-4a96-a8e2-03e5139aa169 req-c83b51ab-2a45-40f7-bf88-000e083f48a9 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Updating instance_info_cache with network_info: [{"id": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "address": "fa:16:3e:aa:82:d6", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94f4ff5c-46", "ovs_interfaceid": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:01:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5c6a19ca-20ca-4a96-a8e2-03e5139aa169 req-c83b51ab-2a45-40f7-bf88-000e083f48a9 service nova] Releasing lock "refresh_cache-96ecc039-866c-4a11-969f-cb59bd0a4f66" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:01:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-429c97e4-3e6a-4cc7-abf0-c5bb85f74e45 req-10e1e4a5-ef94-4790-8b8c-c3c552f19886 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Received event network-vif-plugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:01:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-429c97e4-3e6a-4cc7-abf0-c5bb85f74e45 req-10e1e4a5-ef94-4790-8b8c-c3c552f19886 service nova] Acquiring lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-429c97e4-3e6a-4cc7-abf0-c5bb85f74e45 req-10e1e4a5-ef94-4790-8b8c-c3c552f19886 service nova] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:05 
user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-429c97e4-3e6a-4cc7-abf0-c5bb85f74e45 req-10e1e4a5-ef94-4790-8b8c-c3c552f19886 service nova] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-429c97e4-3e6a-4cc7-abf0-c5bb85f74e45 req-10e1e4a5-ef94-4790-8b8c-c3c552f19886 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] No waiting events found dispatching network-vif-plugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:01:05 user nova-compute[71474]: WARNING nova.compute.manager [req-429c97e4-3e6a-4cc7-abf0-c5bb85f74e45 req-10e1e4a5-ef94-4790-8b8c-c3c552f19886 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Received unexpected event network-vif-plugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e for instance with vm_state building and task_state spawning. Apr 21 14:01:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:01:07 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] VM Resumed (Lifecycle Event) Apr 21 14:01:07 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Instance spawned successfully. 
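The disk preparation logged at 14:01:01 above follows the Qcow2 imagebackend pattern named in those records: the cached base image under instances/_base/ is probed with qemu-img info --force-share, and the instance then gets a small qcow2 copy-on-write overlay whose backing file is that shared base (qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw). A minimal Python sketch of that same pattern, using the paths and the 1 GiB root disk size from this log, is below; it is illustrative only, since Nova itself runs these commands through oslo.concurrency's processutils under a prlimit wrapper rather than plain subprocess.

    # Illustrative sketch, not Nova's code: probe the cached base image, then
    # create the instance's qcow2 copy-on-write overlay on top of it, mirroring
    # the qemu-img commands captured at 14:01:01 above.
    import json
    import subprocess

    base = "/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7"
    overlay = "/opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk"

    # --force-share allows inspecting an image that other processes may have open.
    info = json.loads(subprocess.check_output(
        ["qemu-img", "info", base, "--force-share", "--output=json"]))
    print(info["format"], info["virtual-size"])  # the cached base here is raw

    # 1073741824 bytes = the 1 GiB root disk of the m1.nano flavor used in this run.
    subprocess.check_call(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={base},backing_fmt=raw", overlay, "1073741824"])
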
Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.compute.manager [req-df47348b-390d-4241-854d-c98948b2823d req-ee1cc8f4-507a-4865-84a7-83514cd8fd64 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Received event network-vif-plugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-df47348b-390d-4241-854d-c98948b2823d req-ee1cc8f4-507a-4865-84a7-83514cd8fd64 service nova] Acquiring lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-df47348b-390d-4241-854d-c98948b2823d req-ee1cc8f4-507a-4865-84a7-83514cd8fd64 service nova] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-df47348b-390d-4241-854d-c98948b2823d req-ee1cc8f4-507a-4865-84a7-83514cd8fd64 service nova] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.compute.manager [req-df47348b-390d-4241-854d-c98948b2823d req-ee1cc8f4-507a-4865-84a7-83514cd8fd64 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] No waiting events found dispatching network-vif-plugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:01:07 user nova-compute[71474]: WARNING nova.compute.manager [req-df47348b-390d-4241-854d-c98948b2823d req-ee1cc8f4-507a-4865-84a7-83514cd8fd64 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Received unexpected event network-vif-plugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e for instance with vm_state building and task_state spawning. 
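The "Received unexpected event network-vif-plugged-..." warnings above are benign during spawn: the build path registers the Neutron events it intends to wait for before plugging the VIF and starting the guest, and any event that arrives when no waiter is registered is popped, logged as unexpected, and dropped. A much-simplified sketch of that register-then-wait pattern follows; it is not Nova's InstanceEvents implementation, just an assumed minimal model of the behaviour visible in these records.

    # Simplified sketch of the "wait for network-vif-plugged" pattern seen above.
    import threading

    class EventWaiter:
        def __init__(self):
            self._lock = threading.Lock()
            self._waiting = {}  # event name -> threading.Event

        def expect(self, name):
            # Register interest in an event before triggering the work.
            with self._lock:
                return self._waiting.setdefault(name, threading.Event())

        def deliver(self, name):
            # Wake the waiter if one exists; otherwise the event is "unexpected".
            with self._lock:
                ev = self._waiting.pop(name, None)
            if ev is None:
                print(f"unexpected event {name}, no waiter registered")
            else:
                ev.set()

    waiter = EventWaiter()
    ev = waiter.expect("network-vif-plugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e")
    # ... plug the VIF and start the guest here ...
    waiter.deliver("network-vif-plugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e")
    ev.wait(timeout=300)  # a second delivery of the same event would be dropped
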
Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:01:07 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:01:07 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] VM Started (Lifecycle Event) Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:01:07 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:01:07 user nova-compute[71474]: INFO nova.compute.manager [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Took 6.21 seconds to spawn the instance on the hypervisor. Apr 21 14:01:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:01:07 user nova-compute[71474]: INFO nova.compute.manager [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Took 6.92 seconds to build instance. 
Apr 21 14:01:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8f553bc4-f568-4262-b1a2-e63718ffed39 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.026s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:08 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:01:08 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] VM Stopped (Lifecycle Event) Apr 21 14:01:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8ebe8be2-de17-40f4-86d0-b11dbfe80a64 None None] [instance: 90591d9b-6d6b-4f22-a3dc-fd83044df26b] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:01:08 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Acquiring lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Acquiring lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 
tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:13 user nova-compute[71474]: INFO nova.compute.manager [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Terminating instance Apr 21 14:01:13 user nova-compute[71474]: DEBUG nova.compute.manager [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG nova.compute.manager [req-98983001-48b2-4c6d-9a66-b56b1e070c2f req-d3c2eaf0-680e-4c70-9c7e-0929520cf779 service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Received event network-vif-unplugged-15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-98983001-48b2-4c6d-9a66-b56b1e070c2f req-d3c2eaf0-680e-4c70-9c7e-0929520cf779 service nova] Acquiring lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-98983001-48b2-4c6d-9a66-b56b1e070c2f req-d3c2eaf0-680e-4c70-9c7e-0929520cf779 service nova] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-98983001-48b2-4c6d-9a66-b56b1e070c2f req-d3c2eaf0-680e-4c70-9c7e-0929520cf779 service nova] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG nova.compute.manager [req-98983001-48b2-4c6d-9a66-b56b1e070c2f req-d3c2eaf0-680e-4c70-9c7e-0929520cf779 service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] No waiting events found dispatching network-vif-unplugged-15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:01:13 user nova-compute[71474]: DEBUG nova.compute.manager [req-98983001-48b2-4c6d-9a66-b56b1e070c2f req-d3c2eaf0-680e-4c70-9c7e-0929520cf779 service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Received event network-vif-unplugged-15758bd1-cf67-4bc6-9408-9740cd79d26d for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:14 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Instance destroyed successfully. Apr 21 14:01:14 user nova-compute[71474]: DEBUG nova.objects.instance [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lazy-loading 'resources' on Instance uuid 2ae07df3-4bf4-44a5-a772-3507a6dde6ab {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:59:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-781003247',display_name='tempest-SnapshotDataIntegrityTests-server-781003247',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-781003247',id=9,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKR10aEOjapD46pprj3PVJ6Gx5H8VQYM8ILWN6S/kFyLgzYn0q969VADuMdlZwliZmFRI4vN1i/LRJPIe9UBMJ19tPqMq6iASFPFIt5SbKZXfihMoI1E5AGB6AltaW1bLw==',key_name='tempest-SnapshotDataIntegrityTests-1470211823',keypairs=,launch_index=0,launched_at=2023-04-21T13:59:27Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='9daf036d4ad84586a628c454408e3d7d',ramdisk_id='',reservation_id='r-bi8kumkc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-SnapshotDataIntegrityTests-1600761065',owner_user_name='tempest-SnapshotDataIntegrityTests-1600761065-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T13:59:28Z,user_data=None,user_id='1a2438d69a684df69e1de2edddc73bc0',uuid=2ae07df3-4bf4-44a5-a772-3507a6dde6ab,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "address": "fa:16:3e:19:e6:4f", "network": {"id": "ee3596e4-eb99-458a-b7ae-a48f8bcd58c7", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-1394515631-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9daf036d4ad84586a628c454408e3d7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap15758bd1-cf", "ovs_interfaceid": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Converting VIF {"id": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "address": "fa:16:3e:19:e6:4f", "network": {"id": "ee3596e4-eb99-458a-b7ae-a48f8bcd58c7", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-1394515631-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "9daf036d4ad84586a628c454408e3d7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap15758bd1-cf", "ovs_interfaceid": "15758bd1-cf67-4bc6-9408-9740cd79d26d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:e6:4f,bridge_name='br-int',has_traffic_filtering=True,id=15758bd1-cf67-4bc6-9408-9740cd79d26d,network=Network(ee3596e4-eb99-458a-b7ae-a48f8bcd58c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15758bd1-cf') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG os_vif [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e6:4f,bridge_name='br-int',has_traffic_filtering=True,id=15758bd1-cf67-4bc6-9408-9740cd79d26d,network=Network(ee3596e4-eb99-458a-b7ae-a48f8bcd58c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15758bd1-cf') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15758bd1-cf, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:01:14 user nova-compute[71474]: INFO os_vif [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:e6:4f,bridge_name='br-int',has_traffic_filtering=True,id=15758bd1-cf67-4bc6-9408-9740cd79d26d,network=Network(ee3596e4-eb99-458a-b7ae-a48f8bcd58c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15758bd1-cf') Apr 21 14:01:14 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Deleting instance files /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab_del Apr 21 14:01:14 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 
tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Deletion of /opt/stack/data/nova/instances/2ae07df3-4bf4-44a5-a772-3507a6dde6ab_del complete Apr 21 14:01:14 user nova-compute[71474]: INFO nova.compute.manager [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 21 14:01:14 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:01:14 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Took 0.65 seconds to deallocate network for instance. 
Apr 21 14:01:14 user nova-compute[71474]: DEBUG nova.compute.manager [req-31b8d6f3-0139-4f33-90ce-cc4172d96a04 req-5cd470e6-66e5-400a-8304-0f9640b4973d service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Received event network-vif-deleted-15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:01:14 user nova-compute[71474]: INFO nova.compute.manager [req-31b8d6f3-0139-4f33-90ce-cc4172d96a04 req-5cd470e6-66e5-400a-8304-0f9640b4973d service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Neutron deleted interface 15758bd1-cf67-4bc6-9408-9740cd79d26d; detaching it from the instance and deleting it from the info cache Apr 21 14:01:14 user nova-compute[71474]: DEBUG nova.network.neutron [req-31b8d6f3-0139-4f33-90ce-cc4172d96a04 req-5cd470e6-66e5-400a-8304-0f9640b4973d service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG nova.compute.manager [req-31b8d6f3-0139-4f33-90ce-cc4172d96a04 req-5cd470e6-66e5-400a-8304-0f9640b4973d service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Detach interface failed, port_id=15758bd1-cf67-4bc6-9408-9740cd79d26d, reason: Instance 2ae07df3-4bf4-44a5-a772-3507a6dde6ab could not be found. {{(pid=71474) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:15 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:01:15 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:01:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.398s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:15 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Deleted allocations for instance 2ae07df3-4bf4-44a5-a772-3507a6dde6ab Apr 21 14:01:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-16397692-199a-407a-b8c2-80319b74e6a3 tempest-SnapshotDataIntegrityTests-1600761065 tempest-SnapshotDataIntegrityTests-1600761065-project-member] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.128s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:15 user nova-compute[71474]: DEBUG nova.compute.manager [req-9ca314ce-0ec6-4ec0-aef6-231e873a6390 req-b0a0f3f9-8d90-424c-9474-a8c2b4d85019 service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Received event network-vif-plugged-15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:01:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9ca314ce-0ec6-4ec0-aef6-231e873a6390 req-b0a0f3f9-8d90-424c-9474-a8c2b4d85019 service nova] Acquiring lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9ca314ce-0ec6-4ec0-aef6-231e873a6390 req-b0a0f3f9-8d90-424c-9474-a8c2b4d85019 service nova] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9ca314ce-0ec6-4ec0-aef6-231e873a6390 req-b0a0f3f9-8d90-424c-9474-a8c2b4d85019 service nova] Lock "2ae07df3-4bf4-44a5-a772-3507a6dde6ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:15 user nova-compute[71474]: DEBUG nova.compute.manager [req-9ca314ce-0ec6-4ec0-aef6-231e873a6390 req-b0a0f3f9-8d90-424c-9474-a8c2b4d85019 service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] No waiting events found dispatching network-vif-plugged-15758bd1-cf67-4bc6-9408-9740cd79d26d {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:01:15 user nova-compute[71474]: WARNING nova.compute.manager 
[req-9ca314ce-0ec6-4ec0-aef6-231e873a6390 req-b0a0f3f9-8d90-424c-9474-a8c2b4d85019 service nova] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Received unexpected event network-vif-plugged-15758bd1-cf67-4bc6-9408-9740cd79d26d for instance with vm_state deleted and task_state None. Apr 21 14:01:17 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:29 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:01:29 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] VM Stopped (Lifecycle Event) Apr 21 14:01:29 user nova-compute[71474]: DEBUG nova.compute.manager [None req-22049329-bd98-4ba1-a94a-5c11817cb3dc None None] [instance: 2ae07df3-4bf4-44a5-a772-3507a6dde6ab] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:01:29 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:49 user nova-compute[71474]: INFO nova.compute.manager [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Terminating instance Apr 21 14:01:49 user nova-compute[71474]: DEBUG nova.compute.manager [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Start destroying the instance on the hypervisor. 
{{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG nova.compute.manager [req-941e7bee-2ca2-4e8a-8c75-61e1d17006d2 req-a1d733df-8b99-4926-86cb-1be3c11bbb94 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Received event network-vif-unplugged-9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-941e7bee-2ca2-4e8a-8c75-61e1d17006d2 req-a1d733df-8b99-4926-86cb-1be3c11bbb94 service nova] Acquiring lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-941e7bee-2ca2-4e8a-8c75-61e1d17006d2 req-a1d733df-8b99-4926-86cb-1be3c11bbb94 service nova] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-941e7bee-2ca2-4e8a-8c75-61e1d17006d2 req-a1d733df-8b99-4926-86cb-1be3c11bbb94 service nova] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG nova.compute.manager [req-941e7bee-2ca2-4e8a-8c75-61e1d17006d2 req-a1d733df-8b99-4926-86cb-1be3c11bbb94 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] No waiting events found dispatching network-vif-unplugged-9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:01:49 user nova-compute[71474]: DEBUG nova.compute.manager [req-941e7bee-2ca2-4e8a-8c75-61e1d17006d2 req-a1d733df-8b99-4926-86cb-1be3c11bbb94 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Received event network-vif-unplugged-9d4913bf-46f1-4c09-a062-803300bbed23 for instance with task_state deleting. 
{{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:50 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Instance destroyed successfully. Apr 21 14:01:50 user nova-compute[71474]: DEBUG nova.objects.instance [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lazy-loading 'resources' on Instance uuid 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:59:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-911731909',display_name='tempest-ServersNegativeTestJSON-server-911731909',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-911731909',id=10,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T14:00:05Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a8c210480b33473c91156b798bcbd8b2',ramdisk_id='',reservation_id='r-z0r56r24',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1552178734',owner_user_name='tempest-ServersNegativeTestJSON-1552178734-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:00:06Z,user_data=None,user_id='2259f365261c49b28b56ddd1c27c125d',uuid=4f8622ba-dea6-454f-90c8-1f5f6a56e0b4,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9d4913bf-46f1-4c09-a062-803300bbed23", "address": "fa:16:3e:eb:54:b2", "network": {"id": 
"d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d4913bf-46", "ovs_interfaceid": "9d4913bf-46f1-4c09-a062-803300bbed23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converting VIF {"id": "9d4913bf-46f1-4c09-a062-803300bbed23", "address": "fa:16:3e:eb:54:b2", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d4913bf-46", "ovs_interfaceid": "9d4913bf-46f1-4c09-a062-803300bbed23", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:54:b2,bridge_name='br-int',has_traffic_filtering=True,id=9d4913bf-46f1-4c09-a062-803300bbed23,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d4913bf-46') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG os_vif [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:54:b2,bridge_name='br-int',has_traffic_filtering=True,id=9d4913bf-46f1-4c09-a062-803300bbed23,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d4913bf-46') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:01:50 user 
nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d4913bf-46, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:01:50 user nova-compute[71474]: INFO os_vif [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:54:b2,bridge_name='br-int',has_traffic_filtering=True,id=9d4913bf-46f1-4c09-a062-803300bbed23,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d4913bf-46') Apr 21 14:01:50 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Deleting instance files /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4_del Apr 21 14:01:50 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Deletion of /opt/stack/data/nova/instances/4f8622ba-dea6-454f-90c8-1f5f6a56e0b4_del complete Apr 21 14:01:50 user nova-compute[71474]: INFO nova.compute.manager [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Took 0.89 seconds to destroy the instance on the hypervisor. Apr 21 14:01:50 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:50 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:01:50 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Took 0.49 seconds to deallocate network for instance. Apr 21 14:01:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:51 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:01:51 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:01:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.288s 
{{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:51 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Deleted allocations for instance 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4 Apr 21 14:01:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4390cccd-f130-4fd0-a998-bfa1dd1e5a3b tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.913s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:51 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:01:51 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:01:51 user nova-compute[71474]: DEBUG nova.compute.manager [req-59aeaba1-7c23-4ab5-9e7b-714d0e6e6ed6 req-3747303a-82b4-415c-aa36-b148698bd431 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Received event network-vif-plugged-9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:01:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-59aeaba1-7c23-4ab5-9e7b-714d0e6e6ed6 req-3747303a-82b4-415c-aa36-b148698bd431 service nova] Acquiring lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-59aeaba1-7c23-4ab5-9e7b-714d0e6e6ed6 req-3747303a-82b4-415c-aa36-b148698bd431 service nova] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-59aeaba1-7c23-4ab5-9e7b-714d0e6e6ed6 req-3747303a-82b4-415c-aa36-b148698bd431 service nova] Lock "4f8622ba-dea6-454f-90c8-1f5f6a56e0b4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:51 user nova-compute[71474]: DEBUG nova.compute.manager [req-59aeaba1-7c23-4ab5-9e7b-714d0e6e6ed6 req-3747303a-82b4-415c-aa36-b148698bd431 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] No waiting events found dispatching network-vif-plugged-9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:01:51 user nova-compute[71474]: WARNING nova.compute.manager 
[req-59aeaba1-7c23-4ab5-9e7b-714d0e6e6ed6 req-3747303a-82b4-415c-aa36-b148698bd431 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Received unexpected event network-vif-plugged-9d4913bf-46f1-4c09-a062-803300bbed23 for instance with vm_state deleted and task_state None. Apr 21 14:01:51 user nova-compute[71474]: DEBUG nova.compute.manager [req-59aeaba1-7c23-4ab5-9e7b-714d0e6e6ed6 req-3747303a-82b4-415c-aa36-b148698bd431 service nova] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Received event network-vif-deleted-9d4913bf-46f1-4c09-a062-803300bbed23 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:01:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:01:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:52 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk --force-share --output=json" returned: 0 in 0.173s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "3fdceb24-27b8-45d3-93b2-6d68decc26b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 
14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "3fdceb24-27b8-45d3-93b2-6d68decc26b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:01:53 user nova-compute[71474]: INFO nova.compute.claims [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Claim successful on node user Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:53 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.572s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Start building networks asynchronously for instance. 
{{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:01:54 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:01:54 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Start building block device mappings for instance. 
{{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:54 user nova-compute[71474]: INFO nova.virt.block_device [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Booting with blank volume at /dev/vda Apr 21 14:01:54 user nova-compute[71474]: DEBUG nova.policy [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '132913991f8c45c1adaf5db7ef7cea30', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '885cdc1521a14985bfa70ae21e73c693', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71474) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:01:55 user nova-compute[71474]: WARNING nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Volume id: 5217d76e-296c-489f-8740-d0bdcc933c54 finished being created but its status is error. Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume 5217d76e-296c-489f-8740-d0bdcc933c54 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
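Note: the VolumeNotCreated error above fires on the very first status poll because the Cinder volume went straight to 'error', which is why the message reports "0 seconds or 1 attempts"; the traceback that follows shows the call path from _prep_block_device down to _await_block_device_map_created. A minimal sketch of this kind of wait loop (a hypothetical simplification, not nova's actual implementation; get_status stands in for a Cinder volume-status lookup):

    import time

    def wait_for_volume(get_status, volume_id, max_attempts=60, interval=1.0):
        # Poll the volume status; give up immediately if it reaches 'error',
        # which is why a volume that fails on creation is reported after
        # "0 seconds or 1 attempts".
        for attempt in range(1, max_attempts + 1):
            status = get_status(volume_id)
            if status == 'available':
                return attempt
            if status == 'error':
                raise RuntimeError(
                    'Volume %s did not finish being created; status is %s'
                    % (volume_id, status))
            time.sleep(interval)
        raise RuntimeError('Timed out waiting for volume %s' % volume_id)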
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Traceback (most recent call last):
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] driver_block_device.attach_block_devices(
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] _log_and_attach(device)
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] bdm.attach(*attach_args, **attach_kwargs)
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] File "/opt/stack/nova/nova/virt/block_device.py", line 848, in attach
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] self.volume_id, self.attachment_id = self._create_volume(
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] self._call_wait_func(context, wait_func, volume_api, vol['id'])
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] with excutils.save_and_reraise_exception():
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] self.force_reraise()
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] raise self.value
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] wait_func(context, volume_id)
Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in
_await_block_device_map_created Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] nova.exception.VolumeNotCreated: Volume 5217d76e-296c-489f-8740-d0bdcc933c54 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 21 14:01:55 user nova-compute[71474]: ERROR nova.compute.manager [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Successfully created port: 4db14416-73b5-445f-9eef-7afad30c9cb7 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:55 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:01:55 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8255MB free_disk=26.047019958496094GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 30068c4a-94ed-4b84-9178-0d554326fc68 actively managed on this compute host and has allocations in placement: 
{'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 2c5afe45-87ae-477a-8bf0-6a5e2036fb68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance f0f32b68-6993-4843-bcc6-bd0e06377b27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5e502c4c-a46b-4670-acba-2fda2d05adf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 9164203a-8a6b-4078-bd98-c5ea7bc111fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance ef0a7b15-eab4-4705-9f70-9c9117736eb1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 96ecc039-866c-4a11-969f-cb59bd0a4f66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 3fdceb24-27b8-45d3-93b2-6d68decc26b6 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=7GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:01:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.413s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Successfully updated port: 4db14416-73b5-445f-9eef-7afad30c9cb7 {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "refresh_cache-3fdceb24-27b8-45d3-93b2-6d68decc26b6" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquired lock "refresh_cache-3fdceb24-27b8-45d3-93b2-6d68decc26b6" {{(pid=71474) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.compute.manager [req-f94cb0cb-8a71-4876-951c-9ada61be9901 req-09261d93-9b6e-44df-b5fe-633c784c7cc6 service nova] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Received event network-changed-4db14416-73b5-445f-9eef-7afad30c9cb7 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.compute.manager [req-f94cb0cb-8a71-4876-951c-9ada61be9901 req-09261d93-9b6e-44df-b5fe-633c784c7cc6 service nova] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Refreshing instance network info cache due to event network-changed-4db14416-73b5-445f-9eef-7afad30c9cb7. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f94cb0cb-8a71-4876-951c-9ada61be9901 req-09261d93-9b6e-44df-b5fe-633c784c7cc6 service nova] Acquiring lock "refresh_cache-3fdceb24-27b8-45d3-93b2-6d68decc26b6" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Updating instance_info_cache with network_info: [{"id": "4db14416-73b5-445f-9eef-7afad30c9cb7", "address": "fa:16:3e:eb:7d:fe", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db14416-73", "ovs_interfaceid": "4db14416-73b5-445f-9eef-7afad30c9cb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Releasing lock "refresh_cache-3fdceb24-27b8-45d3-93b2-6d68decc26b6" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Instance network_info: |[{"id": "4db14416-73b5-445f-9eef-7afad30c9cb7", "address": "fa:16:3e:eb:7d:fe", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db14416-73", "ovs_interfaceid": "4db14416-73b5-445f-9eef-7afad30c9cb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f94cb0cb-8a71-4876-951c-9ada61be9901 req-09261d93-9b6e-44df-b5fe-633c784c7cc6 service nova] Acquired lock 
"refresh_cache-3fdceb24-27b8-45d3-93b2-6d68decc26b6" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.network.neutron [req-f94cb0cb-8a71-4876-951c-9ada61be9901 req-09261d93-9b6e-44df-b5fe-633c784c7cc6 service nova] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Refreshing network info cache for port 4db14416-73b5-445f-9eef-7afad30c9cb7 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.compute.claims [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Aborting claim: {{(pid=71474) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71474) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:01:56 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.345s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Build of instance 3fdceb24-27b8-45d3-93b2-6d68decc26b6 aborted: Volume 5217d76e-296c-489f-8740-d0bdcc933c54 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.compute.utils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Build of instance 3fdceb24-27b8-45d3-93b2-6d68decc26b6 aborted: Volume 5217d76e-296c-489f-8740-d0bdcc933c54 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71474) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 21 14:01:57 user nova-compute[71474]: ERROR nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Build of instance 3fdceb24-27b8-45d3-93b2-6d68decc26b6 aborted: Volume 5217d76e-296c-489f-8740-d0bdcc933c54 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 3fdceb24-27b8-45d3-93b2-6d68decc26b6 aborted: Volume 5217d76e-296c-489f-8740-d0bdcc933c54 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
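Note: with the volume stuck in error, the build is aborted and the entries that follow unwind what was already set up (the resource claim is aborted, the VIF is unplugged, and the network is deallocated). Separately, the update_available_resource periodic task interleaved through the entries above repeatedly probes each instance disk with qemu-img info under an oslo_concurrency.prlimit wrapper; the probe can be reproduced by hand with the same argv the DEBUG processutils entries record (sketch only; the disk path is one example copied from the log, and 'virtual-size'/'format' are standard keys in qemu-img's JSON output):

    import json
    import subprocess

    # Same argv as the logged command: prlimit caps address space (1 GiB)
    # and CPU time (30 s) before running qemu-img info on the instance disk.
    cmd = [
        '/usr/bin/python3.10', '-m', 'oslo_concurrency.prlimit',
        '--as=1073741824', '--cpu=30', '--',
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa/disk',
        '--force-share', '--output=json',
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    info = json.loads(out)
    # The format and virtual size feed the resource tracker's disk accounting.
    print(info.get('format'), info.get('virtual-size'))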
Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Unplugging VIFs for instance {{(pid=71474) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:01:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-424935120',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-424935120',id=14,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='885cdc1521a14985bfa70ae21e73c693',ramdisk_id='',reservation_id='r-dfp2f4gm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-28514522',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:01:55Z,user_data=None,user_id='132913991f8c45c1adaf5db7ef7cea30',uuid=3fdceb24-27b8-45d3-93b2-6d68decc26b6,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4db14416-73b5-445f-9eef-7afad30c9cb7", "address": "fa:16:3e:eb:7d:fe", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db14416-73", 
"ovs_interfaceid": "4db14416-73b5-445f-9eef-7afad30c9cb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converting VIF {"id": "4db14416-73b5-445f-9eef-7afad30c9cb7", "address": "fa:16:3e:eb:7d:fe", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db14416-73", "ovs_interfaceid": "4db14416-73b5-445f-9eef-7afad30c9cb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:7d:fe,bridge_name='br-int',has_traffic_filtering=True,id=4db14416-73b5-445f-9eef-7afad30c9cb7,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4db14416-73') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG os_vif [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:7d:fe,bridge_name='br-int',has_traffic_filtering=True,id=4db14416-73b5-445f-9eef-7afad30c9cb7,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4db14416-73') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4db14416-73, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:01:57 user nova-compute[71474]: INFO os_vif [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:7d:fe,bridge_name='br-int',has_traffic_filtering=True,id=4db14416-73b5-445f-9eef-7afad30c9cb7,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4db14416-73') Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Unplugged VIFs for instance {{(pid=71474) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.network.neutron [req-f94cb0cb-8a71-4876-951c-9ada61be9901 req-09261d93-9b6e-44df-b5fe-633c784c7cc6 service nova] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Updated VIF entry in instance network info cache for port 4db14416-73b5-445f-9eef-7afad30c9cb7. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.network.neutron [req-f94cb0cb-8a71-4876-951c-9ada61be9901 req-09261d93-9b6e-44df-b5fe-633c784c7cc6 service nova] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Updating instance_info_cache with network_info: [{"id": "4db14416-73b5-445f-9eef-7afad30c9cb7", "address": "fa:16:3e:eb:7d:fe", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4db14416-73", "ovs_interfaceid": "4db14416-73b5-445f-9eef-7afad30c9cb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f94cb0cb-8a71-4876-951c-9ada61be9901 req-09261d93-9b6e-44df-b5fe-633c784c7cc6 service nova] Releasing lock "refresh_cache-3fdceb24-27b8-45d3-93b2-6d68decc26b6" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:01:57 user nova-compute[71474]: INFO nova.compute.manager [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 3fdceb24-27b8-45d3-93b2-6d68decc26b6] Took 0.56 seconds to deallocate network for instance. 
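The instance_info_cache payloads logged above are plain JSON lists of VIF dictionaries, and the deallocation step simply rewrites the cache to []. As a self-contained sketch of what those entries contain, the snippet below parses a trimmed copy of the entry logged for port 4db14416-73b5-445f-9eef-7afad30c9cb7 and pulls out the port id, MAC, bridge, and fixed IPs (standard-library json only, no Nova imports):

    import json

    vif_json = '''
    {"id": "4db14416-73b5-445f-9eef-7afad30c9cb7",
     "address": "fa:16:3e:eb:7d:fe",
     "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734",
                 "bridge": "br-int",
                 "subnets": [{"cidr": "10.0.0.0/28",
                              "ips": [{"address": "10.0.0.7", "type": "fixed"}]}]},
     "type": "ovs", "devname": "tap4db14416-73", "active": false}
    '''

    vif = json.loads(vif_json)
    fixed_ips = [ip["address"]
                 for subnet in vif["network"]["subnets"]
                 for ip in subnet["ips"]
                 if ip["type"] == "fixed"]
    print(vif["id"], vif["address"], vif["network"]["bridge"], fixed_ips)
    # -> 4db14416-73b5-445f-9eef-7afad30c9cb7 fa:16:3e:eb:7d:fe br-int ['10.0.0.7']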
Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Updating instance_info_cache with network_info: [{"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.41", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-f0f32b68-6993-4843-bcc6-bd0e06377b27" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:01:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:01:57 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Deleted allocations for instance 3fdceb24-27b8-45d3-93b2-6d68decc26b6 Apr 21 14:01:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f00ab4ff-442e-4e8f-9a35-431c39aa8c1c tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "3fdceb24-27b8-45d3-93b2-6d68decc26b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 4.139s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:01:58 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Acquiring lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Acquiring lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:03 user nova-compute[71474]: INFO nova.compute.manager [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Terminating instance Apr 21 14:02:03 user nova-compute[71474]: DEBUG nova.compute.manager [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:02:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:04 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Instance destroyed successfully. 
Apr 21 14:02:04 user nova-compute[71474]: DEBUG nova.objects.instance [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lazy-loading 'resources' on Instance uuid 2c5afe45-87ae-477a-8bf0-6a5e2036fb68 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1149674532',display_name='tempest-ServerStableDeviceRescueTest-server-1149674532',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-1149674532',id=4,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMkCCUCEByuZA8uefKTCCf6BwNOi3GFQvMJ7eA+GdJBuYKCUigOvF7jv5smuTcvHYLmZKP4LkvWhlc4WMHNO3mTFd+RXuNxX7VqhNcJysaZOOp2XhD7KgmsEEHk9+iiuvQ==',key_name='tempest-keypair-1535115748',keypairs=,launch_index=0,launched_at=2023-04-21T13:58:28Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b4c4270d6dfa435f94da018d12586bcd',ramdisk_id='',reservation_id='r-w1rsfwl0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerStableDeviceRescueTest-1083322898',owner_user_name='tempest-ServerStableDeviceRescueTest-1083322898-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:00:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0d7d1e7446af4edf8e35a9d0178b2895',uuid=2c5afe45-87ae-477a-8bf0-6a5e2036fb68,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Converting VIF {"id": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "address": "fa:16:3e:52:41:c0", "network": {"id": "43525cbd-9d02-4cd7-b457-b26a485106f5", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1453701117-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4c4270d6dfa435f94da018d12586bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2616f5a4-1b", "ovs_interfaceid": "2616f5a4-1b53-44bd-82ad-65419e2839ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:41:c0,bridge_name='br-int',has_traffic_filtering=True,id=2616f5a4-1b53-44bd-82ad-65419e2839ca,network=Network(43525cbd-9d02-4cd7-b457-b26a485106f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2616f5a4-1b') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG os_vif [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:41:c0,bridge_name='br-int',has_traffic_filtering=True,id=2616f5a4-1b53-44bd-82ad-65419e2839ca,network=Network(43525cbd-9d02-4cd7-b457-b26a485106f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2616f5a4-1b') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
{{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2616f5a4-1b, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:02:04 user nova-compute[71474]: INFO os_vif [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:41:c0,bridge_name='br-int',has_traffic_filtering=True,id=2616f5a4-1b53-44bd-82ad-65419e2839ca,network=Network(43525cbd-9d02-4cd7-b457-b26a485106f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2616f5a4-1b') Apr 21 14:02:04 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Deleting instance files /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68_del Apr 21 14:02:04 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Deletion of /opt/stack/data/nova/instances/2c5afe45-87ae-477a-8bf0-6a5e2036fb68_del complete Apr 21 14:02:04 user nova-compute[71474]: INFO nova.compute.manager [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 21 14:02:04 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:02:04 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-07ae878e-068e-4153-b11a-f9ff6f75831b req-ba087b6a-341a-4cb6-aa33-8c6b033db86d service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Received event network-vif-unplugged-2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-07ae878e-068e-4153-b11a-f9ff6f75831b req-ba087b6a-341a-4cb6-aa33-8c6b033db86d service nova] Acquiring lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-07ae878e-068e-4153-b11a-f9ff6f75831b req-ba087b6a-341a-4cb6-aa33-8c6b033db86d service nova] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-07ae878e-068e-4153-b11a-f9ff6f75831b req-ba087b6a-341a-4cb6-aa33-8c6b033db86d service nova] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-07ae878e-068e-4153-b11a-f9ff6f75831b req-ba087b6a-341a-4cb6-aa33-8c6b033db86d service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] No waiting events found dispatching network-vif-unplugged-2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-07ae878e-068e-4153-b11a-f9ff6f75831b req-ba087b6a-341a-4cb6-aa33-8c6b033db86d service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Received event network-vif-unplugged-2616f5a4-1b53-44bd-82ad-65419e2839ca for instance with task_state deleting. 
{{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:02:05 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] VM Stopped (Lifecycle Event) Apr 21 14:02:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-30af5ef4-cf3b-46d3-b25a-4f6cff95dd5b None None] [instance: 4f8622ba-dea6-454f-90c8-1f5f6a56e0b4] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:02:05 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Took 1.00 seconds to deallocate network for instance. Apr 21 14:02:05 user nova-compute[71474]: DEBUG nova.compute.manager [req-dd6c227f-6386-40c2-8295-f4902b0ac84b req-133f1cf9-feb1-4049-8ddc-3667b1d3da9f service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Received event network-vif-deleted-2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:06 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:02:06 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:02:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.245s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:06 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Deleted allocations for instance 2c5afe45-87ae-477a-8bf0-6a5e2036fb68 Apr 21 14:02:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a4d80742-93df-4f9b-ae03-06c0011acf68 tempest-ServerStableDeviceRescueTest-1083322898 tempest-ServerStableDeviceRescueTest-1083322898-project-member] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.289s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:06 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:07 user nova-compute[71474]: DEBUG nova.compute.manager [req-eeeb2f78-87f5-4196-b1ae-0fe55425a313 req-2d64456e-8813-4a1d-82ab-70846db8fdda service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Received event network-vif-plugged-2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-eeeb2f78-87f5-4196-b1ae-0fe55425a313 req-2d64456e-8813-4a1d-82ab-70846db8fdda service nova] Acquiring lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-eeeb2f78-87f5-4196-b1ae-0fe55425a313 req-2d64456e-8813-4a1d-82ab-70846db8fdda service nova] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-eeeb2f78-87f5-4196-b1ae-0fe55425a313 req-2d64456e-8813-4a1d-82ab-70846db8fdda service nova] Lock "2c5afe45-87ae-477a-8bf0-6a5e2036fb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:07 user nova-compute[71474]: DEBUG nova.compute.manager [req-eeeb2f78-87f5-4196-b1ae-0fe55425a313 req-2d64456e-8813-4a1d-82ab-70846db8fdda 
service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] No waiting events found dispatching network-vif-plugged-2616f5a4-1b53-44bd-82ad-65419e2839ca {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:07 user nova-compute[71474]: WARNING nova.compute.manager [req-eeeb2f78-87f5-4196-b1ae-0fe55425a313 req-2d64456e-8813-4a1d-82ab-70846db8fdda service nova] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Received unexpected event network-vif-plugged-2616f5a4-1b53-44bd-82ad-65419e2839ca for instance with vm_state deleted and task_state None. Apr 21 14:02:09 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:09 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:09 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:17 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:19 user nova-compute[71474]: INFO nova.compute.manager [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Terminating instance Apr 21 14:02:19 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:02:19 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] VM Stopped (Lifecycle Event) Apr 21 14:02:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG nova.compute.manager [None req-356a5da1-f243-4589-b617-24116068a393 None None] [instance: 2c5afe45-87ae-477a-8bf0-6a5e2036fb68] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG nova.compute.manager [req-a9f3a002-5cec-4426-bf9c-81dab3542377 req-7bdf0993-5adc-402e-a1f4-c6661d8a03a4 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Received event network-vif-unplugged-70605424-311e-401f-b769-3e037210f46a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [req-a9f3a002-5cec-4426-bf9c-81dab3542377 req-7bdf0993-5adc-402e-a1f4-c6661d8a03a4 service nova] Acquiring lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a9f3a002-5cec-4426-bf9c-81dab3542377 req-7bdf0993-5adc-402e-a1f4-c6661d8a03a4 service nova] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a9f3a002-5cec-4426-bf9c-81dab3542377 req-7bdf0993-5adc-402e-a1f4-c6661d8a03a4 service nova] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG nova.compute.manager [req-a9f3a002-5cec-4426-bf9c-81dab3542377 req-7bdf0993-5adc-402e-a1f4-c6661d8a03a4 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] No waiting events found dispatching network-vif-unplugged-70605424-311e-401f-b769-3e037210f46a {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:19 user nova-compute[71474]: DEBUG nova.compute.manager [req-a9f3a002-5cec-4426-bf9c-81dab3542377 req-7bdf0993-5adc-402e-a1f4-c6661d8a03a4 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Received event network-vif-unplugged-70605424-311e-401f-b769-3e037210f46a for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:02:20 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Instance destroyed successfully. 
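The network-vif-unplugged / network-vif-plugged messages above arrive from Neutron through the external-event API; "No waiting events found dispatching ..." simply means nothing on the compute side was blocked waiting for that event, and the later WARNING shows the same event arriving after the instance is already deleted. A loose sketch of the prepare/pop idea behind those lines, using plain threading primitives and hypothetical names rather than Nova's own event objects:

    import threading

    _waiters = {}          # (instance_uuid, event_key) -> threading.Event
    _waiters_lock = threading.Lock()

    def prepare_for_event(instance_uuid, event_key):
        # Called before an operation that expects a Neutron notification,
        # e.g. ("<uuid>", "network-vif-unplugged-<port-id>").
        ev = threading.Event()
        with _waiters_lock:
            _waiters[(instance_uuid, event_key)] = ev
        return ev

    def pop_instance_event(instance_uuid, event_key):
        # Called when the external event arrives; returns the waiter, if any.
        with _waiters_lock:
            return _waiters.pop((instance_uuid, event_key), None)

    def external_instance_event(instance_uuid, event_key):
        waiter = pop_instance_event(instance_uuid, event_key)
        if waiter is None:
            print("No waiting events found dispatching", event_key)
            return
        waiter.set()   # unblock whoever called prepare_for_event(...).wait()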
Apr 21 14:02:20 user nova-compute[71474]: DEBUG nova.objects.instance [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lazy-loading 'resources' on Instance uuid 9164203a-8a6b-4078-bd98-c5ea7bc111fa {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:00:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-570817423',display_name='tempest-VolumesAdminNegativeTest-server-570817423',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-570817423',id=11,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T14:00:33Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='15f83d6d2c3049e9ba1ac7f04ad2ebb0',ramdisk_id='',reservation_id='r-6lwxzn0d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-1182596808',owner_user_name='tempest-VolumesAdminNegativeTest-1182596808-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:00:34Z,user_data=None,user_id='b60caf53ee58417cb76a77c963a45ec2',uuid=9164203a-8a6b-4078-bd98-c5ea7bc111fa,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70605424-311e-401f-b769-3e037210f46a", "address": "fa:16:3e:b4:50:60", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", 
"bound_drivers": {"0": "ovn"}}, "devname": "tap70605424-31", "ovs_interfaceid": "70605424-311e-401f-b769-3e037210f46a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converting VIF {"id": "70605424-311e-401f-b769-3e037210f46a", "address": "fa:16:3e:b4:50:60", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap70605424-31", "ovs_interfaceid": "70605424-311e-401f-b769-3e037210f46a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:50:60,bridge_name='br-int',has_traffic_filtering=True,id=70605424-311e-401f-b769-3e037210f46a,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70605424-31') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG os_vif [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:50:60,bridge_name='br-int',has_traffic_filtering=True,id=70605424-311e-401f-b769-3e037210f46a,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70605424-31') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70605424-31, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:02:20 user nova-compute[71474]: INFO os_vif [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:50:60,bridge_name='br-int',has_traffic_filtering=True,id=70605424-311e-401f-b769-3e037210f46a,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70605424-31') Apr 21 14:02:20 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Deleting instance files /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa_del Apr 21 14:02:20 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Deletion of /opt/stack/data/nova/instances/9164203a-8a6b-4078-bd98-c5ea7bc111fa_del complete Apr 21 14:02:20 user nova-compute[71474]: INFO nova.compute.manager [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Took 0.68 seconds to destroy the instance on the hypervisor. Apr 21 14:02:20 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:02:20 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Took 0.69 seconds to deallocate network for instance. 
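Each VIF removal above goes through the os-vif library: the Nova VIF dict is converted to a VIFOpenVSwitch object, os_vif.unplug() hands it to the 'ovs' plugin, and the plugin issues the DelPortCommand visible in the ovsdbapp lines. A rough sketch of that call sequence with field values copied from the log; treat the object-construction details (which fields are required, how Network and InstanceInfo are filled in) as approximate, not as a definitive os-vif usage guide:

    import os_vif
    from os_vif.objects import instance_info as ii_obj
    from os_vif.objects import network as network_obj
    from os_vif.objects import vif as vif_obj

    os_vif.initialize()   # loads the linux_bridge / noop / ovs plugins

    vif = vif_obj.VIFOpenVSwitch(
        id='70605424-311e-401f-b769-3e037210f46a',    # values taken from the log
        address='fa:16:3e:b4:50:60',
        vif_name='tap70605424-31',
        bridge_name='br-int',
        plugin='ovs',
        network=network_obj.Network(id='6e372a6f-6444-4977-be86-7a6bb86d8979'))

    instance = ii_obj.InstanceInfo(
        uuid='9164203a-8a6b-4078-bd98-c5ea7bc111fa',
        name='tempest-VolumesAdminNegativeTest-server-570817423')

    # The ovs plugin removes the tap port from br-int (the DelPortCommand above).
    os_vif.unplug(vif, instance)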
Apr 21 14:02:20 user nova-compute[71474]: DEBUG nova.compute.manager [req-2b40144b-585f-4505-9c99-a494a08f1ff5 req-60c4c111-0cfa-49fa-b3ee-c30a00862de6 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Received event network-vif-deleted-70605424-311e-401f-b769-3e037210f46a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:21 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:02:21 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:02:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.248s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:21 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Deleted allocations for instance 9164203a-8a6b-4078-bd98-c5ea7bc111fa Apr 21 14:02:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9994a6f9-f30b-499b-95f1-154db1e5ef36 tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] 
Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.810s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:22 user nova-compute[71474]: DEBUG nova.compute.manager [req-df78cb93-bdcd-48be-b1c8-ca7f2e823ad8 req-af20bb93-ca20-4abc-8da7-e47e2db239b9 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Received event network-vif-plugged-70605424-311e-401f-b769-3e037210f46a {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-df78cb93-bdcd-48be-b1c8-ca7f2e823ad8 req-af20bb93-ca20-4abc-8da7-e47e2db239b9 service nova] Acquiring lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-df78cb93-bdcd-48be-b1c8-ca7f2e823ad8 req-af20bb93-ca20-4abc-8da7-e47e2db239b9 service nova] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-df78cb93-bdcd-48be-b1c8-ca7f2e823ad8 req-af20bb93-ca20-4abc-8da7-e47e2db239b9 service nova] Lock "9164203a-8a6b-4078-bd98-c5ea7bc111fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:22 user nova-compute[71474]: DEBUG nova.compute.manager [req-df78cb93-bdcd-48be-b1c8-ca7f2e823ad8 req-af20bb93-ca20-4abc-8da7-e47e2db239b9 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] No waiting events found dispatching network-vif-plugged-70605424-311e-401f-b769-3e037210f46a {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:22 user nova-compute[71474]: WARNING nova.compute.manager [req-df78cb93-bdcd-48be-b1c8-ca7f2e823ad8 req-af20bb93-ca20-4abc-8da7-e47e2db239b9 service nova] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Received unexpected event network-vif-plugged-70605424-311e-401f-b769-3e037210f46a for instance with vm_state deleted and task_state None. 
Apr 21 14:02:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:35 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:02:35 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] VM Stopped (Lifecycle Event) Apr 21 14:02:35 user nova-compute[71474]: DEBUG nova.compute.manager [None req-4250e9f7-c7fb-49e3-9772-f0b47d107ade None None] [instance: 9164203a-8a6b-4078-bd98-c5ea7bc111fa] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:02:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:42 user nova-compute[71474]: DEBUG nova.compute.manager [req-0619b44a-4c5d-4e18-bd83-7bcd8035ede2 req-f7e36672-f359-4a61-a0a2-2e7e33ced712 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-changed-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:42 user nova-compute[71474]: DEBUG nova.compute.manager [req-0619b44a-4c5d-4e18-bd83-7bcd8035ede2 req-f7e36672-f359-4a61-a0a2-2e7e33ced712 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Refreshing instance network info cache due to event network-changed-662f0568-767d-4dd9-b220-2936b4d96745. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:02:42 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-0619b44a-4c5d-4e18-bd83-7bcd8035ede2 req-f7e36672-f359-4a61-a0a2-2e7e33ced712 service nova] Acquiring lock "refresh_cache-ef0a7b15-eab4-4705-9f70-9c9117736eb1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:02:42 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-0619b44a-4c5d-4e18-bd83-7bcd8035ede2 req-f7e36672-f359-4a61-a0a2-2e7e33ced712 service nova] Acquired lock "refresh_cache-ef0a7b15-eab4-4705-9f70-9c9117736eb1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:02:42 user nova-compute[71474]: DEBUG nova.network.neutron [req-0619b44a-4c5d-4e18-bd83-7bcd8035ede2 req-f7e36672-f359-4a61-a0a2-2e7e33ced712 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Refreshing network info cache for port 662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:02:43 user nova-compute[71474]: DEBUG nova.network.neutron [req-0619b44a-4c5d-4e18-bd83-7bcd8035ede2 req-f7e36672-f359-4a61-a0a2-2e7e33ced712 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Updated VIF entry in instance network info cache for port 662f0568-767d-4dd9-b220-2936b4d96745. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:02:43 user nova-compute[71474]: DEBUG nova.network.neutron [req-0619b44a-4c5d-4e18-bd83-7bcd8035ede2 req-f7e36672-f359-4a61-a0a2-2e7e33ced712 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Updating instance_info_cache with network_info: [{"id": "662f0568-767d-4dd9-b220-2936b4d96745", "address": "fa:16:3e:bf:9c:e9", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.163", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap662f0568-76", "ovs_interfaceid": "662f0568-767d-4dd9-b220-2936b4d96745", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:02:43 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-0619b44a-4c5d-4e18-bd83-7bcd8035ede2 req-f7e36672-f359-4a61-a0a2-2e7e33ced712 service nova] Releasing lock "refresh_cache-ef0a7b15-eab4-4705-9f70-9c9117736eb1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:44 user nova-compute[71474]: INFO nova.compute.manager [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Terminating instance Apr 21 14:02:44 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG nova.compute.manager [req-66cddcd2-c4b6-407b-ad13-939cf2fb3538 req-98e9b68d-8ff8-4c71-b778-305f58548656 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-vif-unplugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-66cddcd2-c4b6-407b-ad13-939cf2fb3538 req-98e9b68d-8ff8-4c71-b778-305f58548656 service nova] Acquiring lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-66cddcd2-c4b6-407b-ad13-939cf2fb3538 req-98e9b68d-8ff8-4c71-b778-305f58548656 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils 
[req-66cddcd2-c4b6-407b-ad13-939cf2fb3538 req-98e9b68d-8ff8-4c71-b778-305f58548656 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG nova.compute.manager [req-66cddcd2-c4b6-407b-ad13-939cf2fb3538 req-98e9b68d-8ff8-4c71-b778-305f58548656 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] No waiting events found dispatching network-vif-unplugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG nova.compute.manager [req-66cddcd2-c4b6-407b-ad13-939cf2fb3538 req-98e9b68d-8ff8-4c71-b778-305f58548656 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-vif-unplugged-662f0568-767d-4dd9-b220-2936b4d96745 for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:45 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Instance destroyed successfully. 
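[Editor's note] The Acquiring / acquired / "released" triplets around the "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" and "compute_resources" locks are emitted by oslo.concurrency's lockutils wrappers (the inner() frames at lockutils.py:404/409/423). A minimal sketch of the two forms that produce this output, reusing lock names from the log; the function bodies are placeholders:

    # Sketch of the oslo.concurrency locking pattern behind the
    # 'Acquiring lock ... / acquired ... waited Ns / "released" ... held Ns' DEBUG lines.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('ef0a7b15-eab4-4705-9f70-9c9117736eb1-events')
    def _pop_event():
        pass  # critical section elided; wait/hold times are logged by the decorator

    # equivalent explicit form, as used around the resource tracker updates
    with lockutils.lock('compute_resources'):
        pass

    _pop_event()
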
Apr 21 14:02:45 user nova-compute[71474]: DEBUG nova.objects.instance [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lazy-loading 'resources' on Instance uuid ef0a7b15-eab4-4705-9f70-9c9117736eb1 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:00:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2136406868',display_name='tempest-AttachVolumeTestJSON-server-2136406868',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-2136406868',id=12,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYfx0D/9HI6dRD4wXlpAeizOIf9VmC7glu2drchWSBPGsmZxukl0JKQLSPBGvOnDZea9iBw8HpwJNLK6oFXfEHEkUs1WkQz/KQVrF/Jrc/AnOokiNeEKYxBPPCAEmxZ0Q==',key_name='tempest-keypair-1493985608',keypairs=,launch_index=0,launched_at=2023-04-21T14:00:58Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='91f5972380fd48eabffd46e6727239ce',ramdisk_id='',reservation_id='r-fvg0azuk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-1194238008',owner_user_name='tempest-AttachVolumeTestJSON-1194238008-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:00:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='113884844de14ec7ac8a20ba06a389b3',uuid=ef0a7b15-eab4-4705-9f70-9c9117736eb1,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "662f0568-767d-4dd9-b220-2936b4d96745", "address": "fa:16:3e:bf:9c:e9", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "172.24.4.163", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap662f0568-76", "ovs_interfaceid": "662f0568-767d-4dd9-b220-2936b4d96745", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converting VIF {"id": "662f0568-767d-4dd9-b220-2936b4d96745", "address": "fa:16:3e:bf:9c:e9", "network": {"id": "23a0f330-371d-4fe5-befe-bc4147bf09c7", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-656541543-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.163", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "91f5972380fd48eabffd46e6727239ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap662f0568-76", "ovs_interfaceid": "662f0568-767d-4dd9-b220-2936b4d96745", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:9c:e9,bridge_name='br-int',has_traffic_filtering=True,id=662f0568-767d-4dd9-b220-2936b4d96745,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap662f0568-76') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG os_vif [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:9c:e9,bridge_name='br-int',has_traffic_filtering=True,id=662f0568-767d-4dd9-b220-2936b4d96745,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap662f0568-76') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:45 user nova-compute[71474]: 
DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap662f0568-76, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:02:45 user nova-compute[71474]: INFO os_vif [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:9c:e9,bridge_name='br-int',has_traffic_filtering=True,id=662f0568-767d-4dd9-b220-2936b4d96745,network=Network(23a0f330-371d-4fe5-befe-bc4147bf09c7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap662f0568-76') Apr 21 14:02:45 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Deleting instance files /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1_del Apr 21 14:02:45 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Deletion of /opt/stack/data/nova/instances/ef0a7b15-eab4-4705-9f70-9c9117736eb1_del complete Apr 21 14:02:45 user nova-compute[71474]: INFO nova.compute.manager [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Took 0.93 seconds to destroy the instance on the hypervisor. Apr 21 14:02:45 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG nova.compute.manager [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Starting instance... 
{{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:45 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:02:45 user nova-compute[71474]: INFO nova.compute.claims [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Claim successful on node user Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:02:46 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Took 0.94 seconds to deallocate network for instance. 
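[Editor's note] The DelPortCommand transactions recorded above (e.g. DelPortCommand(_result=None, port=tap662f0568-76, bridge=br-int, if_exists=True)) come from ovsdbapp's Open_vSwitch schema API, which the os-vif ovs plugin drives. A rough sketch of issuing the same command directly; the OVSDB socket path here is an assumed default, not a value taken from this log:

    # Rough sketch (assumptions noted): replaying the DelPortCommand seen above
    # through ovsdbapp. The db.sock path is an assumed default location.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    idl = connection.OvsdbIdl.from_server(
        'unix:/usr/local/var/run/openvswitch/db.sock', 'Open_vSwitch')
    ovs = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=5))

    # if_exists=True makes the transaction a no-op if the port is already gone
    ovs.del_port('tap662f0568-76', bridge='br-int', if_exists=True).execute(
        check_error=True)
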
Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-545c4e74-41c8-4ca0-a18f-ea57d8b8b0a8 req-acfe9563-ddea-478a-82d9-4f4f3d9d0585 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-vif-deleted-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.304s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Start building networks asynchronously for instance. 
{{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.069s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.network.neutron [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:02:46 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Start building block device mappings for instance. 
{{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:02:46 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Creating image(s) Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "/opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "/opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None 
req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "/opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.246s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.policy [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '132913991f8c45c1adaf5db7ef7cea30', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '885cdc1521a14985bfa70ae21e73c693', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:02:46 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Deleted allocations for instance ef0a7b15-eab4-4705-9f70-9c9117736eb1 Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Acquiring lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" 
acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] No waiting events found dispatching network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:46 user nova-compute[71474]: WARNING nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received unexpected event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 for instance with vm_state deleted and task_state None. Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Acquiring lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] No waiting events found dispatching network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:46 user nova-compute[71474]: WARNING nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: 
ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received unexpected event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 for instance with vm_state deleted and task_state None. Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Acquiring lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] No waiting events found dispatching network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:46 user nova-compute[71474]: WARNING nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received unexpected event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 for instance with vm_state deleted and task_state None. 
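[Editor's note] The network-vif-plugged / network-vif-unplugged / network-vif-deleted events in these entries are pushed by Neutron into Nova's os-server-external-events API; the WARNING lines simply mean the late events arrived after the instance had already been deleted. A rough sketch of the request shape (URL and token are placeholders, not values from this log):

    # Rough sketch of the os-server-external-events call behind the
    # "Received event network-vif-plugged-..." entries. URL/token are placeholders.
    import requests

    payload = {"events": [{
        "server_uuid": "ef0a7b15-eab4-4705-9f70-9c9117736eb1",
        "name": "network-vif-plugged",
        "tag": "662f0568-767d-4dd9-b220-2936b4d96745",  # the Neutron port ID
        "status": "completed",
    }]}
    requests.post("http://nova-api.example/v2.1/os-server-external-events",
                  json=payload,
                  headers={"X-Auth-Token": "<token>"})
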
Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-vif-unplugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Acquiring lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] No waiting events found dispatching network-vif-unplugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:46 user nova-compute[71474]: WARNING nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received unexpected event network-vif-unplugged-662f0568-767d-4dd9-b220-2936b4d96745 for instance with vm_state deleted and task_state None. 
Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Acquiring lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] No waiting events found dispatching network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:46 user nova-compute[71474]: WARNING nova.compute.manager [req-78317bd5-9d3d-496e-ac13-78d364ba4b8e req-01e27d57-3862-40cf-8820-7c416830b2e6 service nova] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Received unexpected event network-vif-plugged-662f0568-767d-4dd9-b220-2936b4d96745 for instance with vm_state deleted and task_state None. 
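[Editor's note] The image-backend entries around this point probe the cached base image with qemu-img info, wrapped in oslo_concurrency.prlimit to cap address space at 1 GiB and CPU time at 30 s, and then create the instance's qcow2 overlay with qemu-img create -o backing_file=...,backing_fmt=raw. A minimal sketch of the probe as issued through oslo.concurrency, with the limits copied from the logged command line:

    # Sketch of the qemu-img probe as run via oslo.concurrency: execute() with a
    # prlimit wrapper, matching the '--as=1073741824 --cpu=30' arguments logged above.
    from oslo_concurrency import processutils

    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)
    out, err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C', 'qemu-img', 'info',
        '/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7',
        '--force-share', '--output=json',
        prlimit=limits)

The overlay itself is then created with qemu-img create -f qcow2 -o backing_file=<base>,backing_fmt=raw <instance dir>/disk 1073741824, as the next entries show.
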
Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.152s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b32b176d-463a-4ddb-a1ff-04111ad59b3d tempest-AttachVolumeTestJSON-1194238008 tempest-AttachVolumeTestJSON-1194238008-project-member] Lock "ef0a7b15-eab4-4705-9f70-9c9117736eb1" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.402s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.128s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk 1073741824" returned: 0 in 0.047s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.180s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.151s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Checking if we can resize image /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk. 
size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:02:46 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Cannot resize image /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk to a smaller size. {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG nova.objects.instance [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lazy-loading 'migration_context' on Instance uuid 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Ensure instance console log exists: /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 
tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG nova.network.neutron [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Successfully created port: 5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG nova.network.neutron [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Successfully updated port: 5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "refresh_cache-4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquired lock "refresh_cache-4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:02:47 user nova-compute[71474]: DEBUG nova.network.neutron [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.network.neutron [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.compute.manager [req-be88b8ef-426d-48ff-a2f3-01ff1a3f69b5 req-e0a7df01-ea5b-4cda-9b5c-e73056bbca01 service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Received event network-changed-5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.compute.manager [req-be88b8ef-426d-48ff-a2f3-01ff1a3f69b5 req-e0a7df01-ea5b-4cda-9b5c-e73056bbca01 service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Refreshing instance network info cache due to event network-changed-5c5a52bd-c710-4614-8353-55be240cfa17. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-be88b8ef-426d-48ff-a2f3-01ff1a3f69b5 req-e0a7df01-ea5b-4cda-9b5c-e73056bbca01 service nova] Acquiring lock "refresh_cache-4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.network.neutron [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Updating instance_info_cache with network_info: [{"id": "5c5a52bd-c710-4614-8353-55be240cfa17", "address": "fa:16:3e:66:41:1d", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c5a52bd-c7", "ovs_interfaceid": "5c5a52bd-c710-4614-8353-55be240cfa17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Releasing lock "refresh_cache-4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Instance network_info: |[{"id": "5c5a52bd-c710-4614-8353-55be240cfa17", "address": "fa:16:3e:66:41:1d", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": 
"tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c5a52bd-c7", "ovs_interfaceid": "5c5a52bd-c710-4614-8353-55be240cfa17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-be88b8ef-426d-48ff-a2f3-01ff1a3f69b5 req-e0a7df01-ea5b-4cda-9b5c-e73056bbca01 service nova] Acquired lock "refresh_cache-4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.network.neutron [req-be88b8ef-426d-48ff-a2f3-01ff1a3f69b5 req-e0a7df01-ea5b-4cda-9b5c-e73056bbca01 service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Refreshing network info cache for port 5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Start _get_guest_xml network_info=[{"id": "5c5a52bd-c710-4614-8353-55be240cfa17", "address": "fa:16:3e:66:41:1d", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c5a52bd-c7", "ovs_interfaceid": "5c5a52bd-c710-4614-8353-55be240cfa17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:02:48 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:02:48 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG 
nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 
tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:02:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-694783197',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-694783197',id=15,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='885cdc1521a14985bfa70ae21e73c693',ramdisk_id='',reservation_id='r-nipn8ele',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-28514522',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:02:46Z,user_data=None,user_id='132913991f8c45c1adaf5db7ef7cea30',uuid=4a44d9f3-28b2-45e7-b952-2bb1735ef5b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c5a52bd-c710-4614-8353-55be240cfa17", "address": "fa:16:3e:66:41:1d", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c5a52bd-c7", "ovs_interfaceid": "5c5a52bd-c710-4614-8353-55be240cfa17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] 
Converting VIF {"id": "5c5a52bd-c710-4614-8353-55be240cfa17", "address": "fa:16:3e:66:41:1d", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c5a52bd-c7", "ovs_interfaceid": "5c5a52bd-c710-4614-8353-55be240cfa17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:41:1d,bridge_name='br-int',has_traffic_filtering=True,id=5c5a52bd-c710-4614-8353-55be240cfa17,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c5a52bd-c7') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.objects.instance [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lazy-loading 'pci_devices' on Instance uuid 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] End _get_guest_xml xml= Apr 21 14:02:48 user nova-compute[71474]: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 Apr 21 14:02:48 user nova-compute[71474]: instance-0000000f Apr 21 14:02:48 user nova-compute[71474]: 131072 Apr 21 14:02:48 user nova-compute[71474]: 1 Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: tempest-ServerBootFromVolumeStableRescueTest-server-694783197 Apr 21 14:02:48 user nova-compute[71474]: 2023-04-21 14:02:48 Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: 128 Apr 21 14:02:48 user nova-compute[71474]: 1 Apr 21 14:02:48 user nova-compute[71474]: 0 Apr 21 14:02:48 user nova-compute[71474]: 0 Apr 21 14:02:48 user nova-compute[71474]: 1 Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member Apr 21 14:02:48 user nova-compute[71474]: tempest-ServerBootFromVolumeStableRescueTest-28514522 Apr 21 
14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: OpenStack Foundation Apr 21 14:02:48 user nova-compute[71474]: OpenStack Nova Apr 21 14:02:48 user nova-compute[71474]: 0.0.0 Apr 21 14:02:48 user nova-compute[71474]: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 Apr 21 14:02:48 user nova-compute[71474]: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 Apr 21 14:02:48 user nova-compute[71474]: Virtual Machine Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: hvm Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Nehalem Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: /dev/urandom Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: Apr 21 14:02:48 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:02:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-694783197',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-694783197',id=15,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='885cdc1521a14985bfa70ae21e73c693',ramdisk_id='',reservation_id='r-nipn8ele',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-28514522',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:02:46Z,user_data=None,user_id='132913991f8c45c1adaf5db7ef7cea30',uuid=4a44d9f3-28b2-45e7-b952-2bb1735ef5b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c5a52bd-c710-4614-8353-55be240cfa17", "address": "fa:16:3e:66:41:1d", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c5a52bd-c7", "ovs_interfaceid": "5c5a52bd-c710-4614-8353-55be240cfa17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converting VIF {"id": "5c5a52bd-c710-4614-8353-55be240cfa17", "address": "fa:16:3e:66:41:1d", 
"network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c5a52bd-c7", "ovs_interfaceid": "5c5a52bd-c710-4614-8353-55be240cfa17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:41:1d,bridge_name='br-int',has_traffic_filtering=True,id=5c5a52bd-c710-4614-8353-55be240cfa17,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c5a52bd-c7') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG os_vif [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:41:1d,bridge_name='br-int',has_traffic_filtering=True,id=5c5a52bd-c710-4614-8353-55be240cfa17,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c5a52bd-c7') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c5a52bd-c7, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5c5a52bd-c7, col_values=(('external_ids', {'iface-id': '5c5a52bd-c710-4614-8353-55be240cfa17', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:41:1d', 'vm-uuid': '4a44d9f3-28b2-45e7-b952-2bb1735ef5b5'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:48 user nova-compute[71474]: INFO os_vif [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:41:1d,bridge_name='br-int',has_traffic_filtering=True,id=5c5a52bd-c710-4614-8353-55be240cfa17,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c5a52bd-c7') Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:02:48 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] No VIF found with MAC fa:16:3e:66:41:1d, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:02:49 user nova-compute[71474]: DEBUG nova.network.neutron [req-be88b8ef-426d-48ff-a2f3-01ff1a3f69b5 req-e0a7df01-ea5b-4cda-9b5c-e73056bbca01 service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Updated VIF entry in instance network info cache for port 5c5a52bd-c710-4614-8353-55be240cfa17. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:02:49 user nova-compute[71474]: DEBUG nova.network.neutron [req-be88b8ef-426d-48ff-a2f3-01ff1a3f69b5 req-e0a7df01-ea5b-4cda-9b5c-e73056bbca01 service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Updating instance_info_cache with network_info: [{"id": "5c5a52bd-c710-4614-8353-55be240cfa17", "address": "fa:16:3e:66:41:1d", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c5a52bd-c7", "ovs_interfaceid": "5c5a52bd-c710-4614-8353-55be240cfa17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:02:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-be88b8ef-426d-48ff-a2f3-01ff1a3f69b5 req-e0a7df01-ea5b-4cda-9b5c-e73056bbca01 service nova] Releasing lock "refresh_cache-4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:02:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:50 user nova-compute[71474]: DEBUG nova.compute.manager [req-7defd574-2058-4f7a-b455-3599e8670fa1 req-bb63a4d1-74d0-4c99-8a7d-c0bd5fc750ed service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Received event network-vif-plugged-5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7defd574-2058-4f7a-b455-3599e8670fa1 req-bb63a4d1-74d0-4c99-8a7d-c0bd5fc750ed service nova] Acquiring lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7defd574-2058-4f7a-b455-3599e8670fa1 req-bb63a4d1-74d0-4c99-8a7d-c0bd5fc750ed service nova] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7defd574-2058-4f7a-b455-3599e8670fa1 req-bb63a4d1-74d0-4c99-8a7d-c0bd5fc750ed service nova] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:50 user nova-compute[71474]: DEBUG nova.compute.manager [req-7defd574-2058-4f7a-b455-3599e8670fa1 req-bb63a4d1-74d0-4c99-8a7d-c0bd5fc750ed service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] No waiting events found dispatching network-vif-plugged-5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:50 user nova-compute[71474]: WARNING nova.compute.manager [req-7defd574-2058-4f7a-b455-3599e8670fa1 req-bb63a4d1-74d0-4c99-8a7d-c0bd5fc750ed service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Received unexpected event network-vif-plugged-5c5a52bd-c710-4614-8353-55be240cfa17 for instance with vm_state building and task_state spawning. Apr 21 14:02:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.compute.manager [req-b8244a85-b0a5-423c-8e85-af214268dc21 req-220d9f67-3fa8-4a6d-b2b0-fd5797a15b0d service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Received event network-changed-94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.compute.manager [req-b8244a85-b0a5-423c-8e85-af214268dc21 req-220d9f67-3fa8-4a6d-b2b0-fd5797a15b0d service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Refreshing instance network info cache due to event network-changed-94f4ff5c-4646-4e4d-814a-40e8e72ad32e. 
{{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-b8244a85-b0a5-423c-8e85-af214268dc21 req-220d9f67-3fa8-4a6d-b2b0-fd5797a15b0d service nova] Acquiring lock "refresh_cache-96ecc039-866c-4a11-969f-cb59bd0a4f66" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-b8244a85-b0a5-423c-8e85-af214268dc21 req-220d9f67-3fa8-4a6d-b2b0-fd5797a15b0d service nova] Acquired lock "refresh_cache-96ecc039-866c-4a11-969f-cb59bd0a4f66" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.network.neutron [req-b8244a85-b0a5-423c-8e85-af214268dc21 req-220d9f67-3fa8-4a6d-b2b0-fd5797a15b0d service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Refreshing network info cache for port 94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:02:51 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] VM Resumed (Lifecycle Event) Apr 21 14:02:51 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Instance spawned successfully. 
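[Editor's note] The ovsdbapp transactions logged above (AddBridgeCommand, AddPortCommand, DbSetCommand on the Interface row's external_ids) are what os-vif issues while plugging the port onto br-int. A roughly equivalent manual sequence, sketched here with ovs-vsctl via subprocess and values copied from the log (this is an illustration, not how os-vif itself performs the operation):

    import subprocess

    port = "tap5c5a52bd-c7"
    iface_id = "5c5a52bd-c710-4614-8353-55be240cfa17"
    mac = "fa:16:3e:66:41:1d"
    vm_uuid = "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5"

    # Idempotent bridge/port creation, then the same external_ids the
    # DbSetCommand writes on the Interface record.
    subprocess.run(["ovs-vsctl", "--may-exist", "add-br", "br-int"], check=True)
    subprocess.run(
        ["ovs-vsctl",
         "--", "--may-exist", "add-port", "br-int", port,
         "--", "set", "Interface", port,
         f"external_ids:iface-id={iface_id}",
         "external_ids:iface-status=active",
         f"external_ids:attached-mac={mac}",
         f"external_ids:vm-uuid={vm_uuid}"],
        check=True,
    )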
Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None 
req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.network.neutron [req-b8244a85-b0a5-423c-8e85-af214268dc21 req-220d9f67-3fa8-4a6d-b2b0-fd5797a15b0d service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Updated VIF entry in instance network info cache for port 94f4ff5c-4646-4e4d-814a-40e8e72ad32e. {{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.network.neutron [req-b8244a85-b0a5-423c-8e85-af214268dc21 req-220d9f67-3fa8-4a6d-b2b0-fd5797a15b0d service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Updating instance_info_cache with network_info: [{"id": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "address": "fa:16:3e:aa:82:d6", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.28", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94f4ff5c-46", "ovs_interfaceid": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:02:51 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] During sync_power_state the instance has a pending task (spawning). Skip. 
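[Editor's note] The "Synchronizing instance power state ... current DB power_state: 0, VM power_state: 1" records compare the state stored in the database with what libvirt reports, and the sync is skipped while a task such as spawning is still in flight. A rough sketch of that decision, with the two power-state values hard-coded as assumptions that match the numbers in the log (0 = NOSTATE, 1 = RUNNING):

    NOSTATE = 0   # assumed constant values, matching the log above
    RUNNING = 1

    def sync_power_state(db_power_state, vm_power_state, task_state):
        # Mirror the "pending task (spawning). Skip." behaviour: do not
        # overwrite the DB state while a task is still in progress.
        if task_state is not None:
            return db_power_state
        # Otherwise trust the hypervisor-reported state.
        return vm_power_state

    # Example from the log: DB says NOSTATE, libvirt says RUNNING,
    # task_state is 'spawning', so nothing is updated yet.
    assert sync_power_state(NOSTATE, RUNNING, "spawning") == NOSTATE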
Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:02:51 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] VM Started (Lifecycle Event) Apr 21 14:02:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-b8244a85-b0a5-423c-8e85-af214268dc21 req-220d9f67-3fa8-4a6d-b2b0-fd5797a15b0d service nova] Releasing lock "refresh_cache-96ecc039-866c-4a11-969f-cb59bd0a4f66" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:02:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:02:52 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:02:52 user nova-compute[71474]: INFO nova.compute.manager [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Took 5.59 seconds to spawn the instance on the hypervisor. Apr 21 14:02:52 user nova-compute[71474]: DEBUG nova.compute.manager [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:02:52 user nova-compute[71474]: INFO nova.compute.manager [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Took 6.20 seconds to build instance. 
Apr 21 14:02:52 user nova-compute[71474]: DEBUG nova.compute.manager [req-9a0a9ca4-f38a-4fd6-aea5-a82e5afbd378 req-1ef8ff85-d88c-4e06-a70e-a6411b79f785 service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Received event network-vif-plugged-5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9a0a9ca4-f38a-4fd6-aea5-a82e5afbd378 req-1ef8ff85-d88c-4e06-a70e-a6411b79f785 service nova] Acquiring lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9a0a9ca4-f38a-4fd6-aea5-a82e5afbd378 req-1ef8ff85-d88c-4e06-a70e-a6411b79f785 service nova] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9a0a9ca4-f38a-4fd6-aea5-a82e5afbd378 req-1ef8ff85-d88c-4e06-a70e-a6411b79f785 service nova] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:52 user nova-compute[71474]: DEBUG nova.compute.manager [req-9a0a9ca4-f38a-4fd6-aea5-a82e5afbd378 req-1ef8ff85-d88c-4e06-a70e-a6411b79f785 service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] No waiting events found dispatching network-vif-plugged-5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:52 user nova-compute[71474]: WARNING nova.compute.manager [req-9a0a9ca4-f38a-4fd6-aea5-a82e5afbd378 req-1ef8ff85-d88c-4e06-a70e-a6411b79f785 service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Received unexpected event network-vif-plugged-5c5a52bd-c710-4614-8353-55be240cfa17 for instance with vm_state active and task_state None. 
Apr 21 14:02:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-a87648c1-fe89-468f-bc4f-777050ea91a0 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.285s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "96ecc039-866c-4a11-969f-cb59bd0a4f66" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:53 user nova-compute[71474]: INFO nova.compute.manager [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Terminating instance Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.compute.manager 
[None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:53 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Instance destroyed successfully. Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.objects.instance [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lazy-loading 'resources' on Instance uuid 96ecc039-866c-4a11-969f-cb59bd0a4f66 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:01:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1998626577',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1998626577',id=13,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEjLzbn3Rtk7UAvXNCb/CqU3NFzUWrQkjNzooEbdsk7L34/ttrJQbQ8G2+mwvgH50DTejh2ROEqL19gr64B+vVPiL7Dti7Dkj0m8tNJC6vM/rbQizA3VE78YsalZDJlEwQ==',key_name='tempest-keypair-33325504',keypairs=,launch_index=0,launched_at=2023-04-21T14:01:07Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='8a8fedc10f324a92aef4142ab7efdd6a',ramdisk_id='',reservation_id='r-n7znc46z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-2115713901',owner_user_name='tempest-AttachVolumeShelveTestJSON-2115713901-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:01:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='92c19bad528a4c38860a43913b28b85b',uuid=96ecc039-866c-4a11-969f-cb59bd0a4f66,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "address": "fa:16:3e:aa:82:d6", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.28", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94f4ff5c-46", "ovs_interfaceid": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converting VIF {"id": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "address": "fa:16:3e:aa:82:d6", "network": {"id": "1815b48e-38a4-4a83-a23b-d7c2ce38a2c3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1971948253-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.28", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8a8fedc10f324a92aef4142ab7efdd6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap94f4ff5c-46", "ovs_interfaceid": "94f4ff5c-4646-4e4d-814a-40e8e72ad32e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:aa:82:d6,bridge_name='br-int',has_traffic_filtering=True,id=94f4ff5c-4646-4e4d-814a-40e8e72ad32e,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94f4ff5c-46') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG os_vif [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:82:d6,bridge_name='br-int',has_traffic_filtering=True,id=94f4ff5c-4646-4e4d-814a-40e8e72ad32e,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94f4ff5c-46') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94f4ff5c-46, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:02:53 user nova-compute[71474]: INFO os_vif [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:82:d6,bridge_name='br-int',has_traffic_filtering=True,id=94f4ff5c-4646-4e4d-814a-40e8e72ad32e,network=Network(1815b48e-38a4-4a83-a23b-d7c2ce38a2c3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap94f4ff5c-46') Apr 21 14:02:53 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 
tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Deleting instance files /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66_del Apr 21 14:02:53 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Deletion of /opt/stack/data/nova/instances/96ecc039-866c-4a11-969f-cb59bd0a4f66_del complete Apr 21 14:02:53 user nova-compute[71474]: INFO nova.compute.manager [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 21 14:02:53 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Rebuilding the list of instances to heal {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Skipping network cache update for instance because it is being deleted. 
{{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9809}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:02:53 user nova-compute[71474]: DEBUG nova.objects.instance [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lazy-loading 'info_cache' on Instance uuid 30068c4a-94ed-4b84-9178-0d554326fc68 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.compute.manager [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Received event network-vif-unplugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] Acquiring lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.compute.manager [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] No waiting events found dispatching network-vif-unplugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.compute.manager [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Received event 
network-vif-unplugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.compute.manager [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Received event network-vif-plugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] Acquiring lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.compute.manager [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] No waiting events found dispatching network-vif-plugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:02:54 user nova-compute[71474]: WARNING nova.compute.manager [req-48fb964f-4d58-4621-8566-5c7791b52cef req-7e7ca3da-6335-4452-a8c7-52cac351acb2 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Received unexpected event network-vif-plugged-94f4ff5c-4646-4e4d-814a-40e8e72ad32e for instance with vm_state active and task_state deleting. 
Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Updating instance_info_cache with network_info: [{"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:02:54 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Took 0.97 seconds to deallocate network for instance. Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.compute.manager [req-cf7c6fac-c895-4038-9fa8-c9c05245ec6d req-9a20255b-069b-4059-81f6-7167262ac2e9 service nova] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] Received event network-vif-deleted-94f4ff5c-4646-4e4d-814a-40e8e72ad32e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.299s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:55 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Deleted allocations for instance 96ecc039-866c-4a11-969f-cb59bd0a4f66 Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None 
req-89eff8d1-5f21-40b2-b3cb-155290f11b75 tempest-AttachVolumeShelveTestJSON-2115713901 tempest-AttachVolumeShelveTestJSON-2115713901-project-member] Lock "96ecc039-866c-4a11-969f-cb59bd0a4f66" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.184s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json" returned: 0 in 0.193s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:02:56 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:02:56 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 14:02:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8744MB free_disk=26.1468505859375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 30068c4a-94ed-4b84-9178-0d554326fc68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance f0f32b68-6993-4843-bcc6-bd0e06377b27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5e502c4c-a46b-4670-acba-2fda2d05adf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:02:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.328s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:02:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:02:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:02:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:02:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:58 user nova-compute[71474]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:02:58 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:02:58 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:02:59 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:00 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:03:00 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] VM Stopped (Lifecycle Event) Apr 21 14:03:00 user nova-compute[71474]: DEBUG nova.compute.manager [None req-76b2bbaa-64c7-47e5-af3d-b2915ceb13b5 None None] [instance: ef0a7b15-eab4-4705-9f70-9c9117736eb1] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:03:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:06 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "a205a2a4-c0de-4c5c-abc4-7b034070e014" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG 
nova.compute.manager [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:03:08 user nova-compute[71474]: INFO nova.compute.claims [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Claim successful on node user Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:03:08 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 96ecc039-866c-4a11-969f-cb59bd0a4f66] VM Stopped (Lifecycle Event) Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-720f3a57-295b-4c1c-a72a-4bd1c7d6d3dc None None] [instance: 
96ecc039-866c-4a11-969f-cb59bd0a4f66] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.network.neutron [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:03:08 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Start spawning the instance on the hypervisor. 
{{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:03:08 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Creating image(s) Apr 21 14:03:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "/opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "/opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "/opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:08 user nova-compute[71474]: DEBUG nova.policy [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4df58f0cb48f4aa29df57f9c2f632782', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '432a123307454a44922597d6c9089447', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 
'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.139s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "80eb182f-948b-42d3-999b-339c5d615a73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.compute.manager [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Starting instance... 
{{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.130s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk 1073741824" returned: 0 in 0.049s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.189s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 
tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:03:09 user nova-compute[71474]: INFO nova.compute.claims [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Claim successful on node user Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Checking if we can resize image /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Cannot resize image /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk to a smaller size. 
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.objects.instance [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'migration_context' on Instance uuid a205a2a4-c0de-4c5c-abc4-7b034070e014 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Ensure instance console log exists: /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.network.neutron [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Successfully created port: 10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None 
req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.368s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.compute.manager [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.compute.manager [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.network.neutron [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:03:09 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.compute.manager [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Start building block device mappings for instance. 
{{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.policy [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4df58f0cb48f4aa29df57f9c2f632782', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '432a123307454a44922597d6c9089447', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.compute.manager [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:03:09 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Creating image(s) Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "/opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "/opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "/opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:09 user nova-compute[71474]: DEBUG 
oslo_concurrency.processutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.157s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "f0f32b68-6993-4843-bcc6-bd0e06377b27" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:10 user nova-compute[71474]: INFO nova.compute.manager [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Terminating instance Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.compute.manager [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Start destroying the instance on the hypervisor. 
{{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.149s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk 1073741824" returned: 0 in 0.050s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.202s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.network.neutron [None 
req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Successfully updated port: 10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.146s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Checking if we can resize image /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.compute.manager [req-e4643efe-447e-49bc-b87d-25028193d3a3 req-659ccdd0-e5ae-4a56-a061-95fd84085bb3 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Received event network-changed-10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.compute.manager [req-e4643efe-447e-49bc-b87d-25028193d3a3 req-659ccdd0-e5ae-4a56-a061-95fd84085bb3 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Refreshing instance network info cache due to event network-changed-10363ff5-34d7-4af3-bd72-c7cb78d665c9. 
{{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e4643efe-447e-49bc-b87d-25028193d3a3 req-659ccdd0-e5ae-4a56-a061-95fd84085bb3 service nova] Acquiring lock "refresh_cache-a205a2a4-c0de-4c5c-abc4-7b034070e014" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e4643efe-447e-49bc-b87d-25028193d3a3 req-659ccdd0-e5ae-4a56-a061-95fd84085bb3 service nova] Acquired lock "refresh_cache-a205a2a4-c0de-4c5c-abc4-7b034070e014" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.network.neutron [req-e4643efe-447e-49bc-b87d-25028193d3a3 req-659ccdd0-e5ae-4a56-a061-95fd84085bb3 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Refreshing network info cache for port 10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "refresh_cache-a205a2a4-c0de-4c5c-abc4-7b034070e014" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.network.neutron [req-e4643efe-447e-49bc-b87d-25028193d3a3 req-659ccdd0-e5ae-4a56-a061-95fd84085bb3 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Instance cache missing network info. {{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.network.neutron [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Successfully created port: def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Cannot resize image /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk to a smaller size. 
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.objects.instance [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'migration_context' on Instance uuid 80eb182f-948b-42d3-999b-339c5d615a73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Ensure instance console log exists: /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.compute.manager [req-7ce8b59a-274b-4ec6-ae84-f640cd8fe06c req-6a063e22-4eec-40de-a662-ff295407d5b0 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-vif-unplugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7ce8b59a-274b-4ec6-ae84-f640cd8fe06c req-6a063e22-4eec-40de-a662-ff295407d5b0 service nova] Acquiring lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7ce8b59a-274b-4ec6-ae84-f640cd8fe06c 
req-6a063e22-4eec-40de-a662-ff295407d5b0 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7ce8b59a-274b-4ec6-ae84-f640cd8fe06c req-6a063e22-4eec-40de-a662-ff295407d5b0 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.compute.manager [req-7ce8b59a-274b-4ec6-ae84-f640cd8fe06c req-6a063e22-4eec-40de-a662-ff295407d5b0 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] No waiting events found dispatching network-vif-unplugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.compute.manager [req-7ce8b59a-274b-4ec6-ae84-f640cd8fe06c req-6a063e22-4eec-40de-a662-ff295407d5b0 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-vif-unplugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.network.neutron [req-e4643efe-447e-49bc-b87d-25028193d3a3 req-659ccdd0-e5ae-4a56-a061-95fd84085bb3 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e4643efe-447e-49bc-b87d-25028193d3a3 req-659ccdd0-e5ae-4a56-a061-95fd84085bb3 service nova] Releasing lock "refresh_cache-a205a2a4-c0de-4c5c-abc4-7b034070e014" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquired lock "refresh_cache-a205a2a4-c0de-4c5c-abc4-7b034070e014" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.network.neutron [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 
tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:03:10 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Instance destroyed successfully. Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.objects.instance [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lazy-loading 'resources' on Instance uuid f0f32b68-6993-4843-bcc6-bd0e06377b27 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1193021950',display_name='tempest-VolumesAdminNegativeTest-server-1193021950',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1193021950',id=6,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOhmqZ33jzOJUNp5cIjTjD2V4mwqGnUNtzXSj78uvtldCN9y9LKEaKBdycKDs4VYN2v9RCyHrUj9yHjgYAuNS07yjzech5h1dSQg5dt5ELnEas6naL+mLGQFJzls0JQplQ==',key_name='tempest-keypair-1507777159',keypairs=,launch_index=0,launched_at=2023-04-21T13:58:43Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='15f83d6d2c3049e9ba1ac7f04ad2ebb0',ramdisk_id='',reservation_id='r-a6d80ryc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-1182596808',owner_user_name='tempest-VolumesAdminNegativeTest-1182596808-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T13:58:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b60caf53ee58417cb76a77c963a45ec2',uuid=f0f32b68-6993-4843-bcc6-bd0e06377b27,vcpu_model=,vcpus=1,vm_mode=None,vm_stat
e='active') vif={"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.41", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converting VIF {"id": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "address": "fa:16:3e:02:78:94", "network": {"id": "6e372a6f-6444-4977-be86-7a6bb86d8979", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2058149994-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.41", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "15f83d6d2c3049e9ba1ac7f04ad2ebb0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap20ca5a57-3c", "ovs_interfaceid": "20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:78:94,bridge_name='br-int',has_traffic_filtering=True,id=20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20ca5a57-3c') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG os_vif [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:02:78:94,bridge_name='br-int',has_traffic_filtering=True,id=20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20ca5a57-3c') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20ca5a57-3c, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:03:10 user nova-compute[71474]: INFO os_vif [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:78:94,bridge_name='br-int',has_traffic_filtering=True,id=20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b,network=Network(6e372a6f-6444-4977-be86-7a6bb86d8979),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20ca5a57-3c') Apr 21 14:03:10 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Deleting instance files /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27_del Apr 21 14:03:10 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Deletion of /opt/stack/data/nova/instances/f0f32b68-6993-4843-bcc6-bd0e06377b27_del complete Apr 21 14:03:10 user nova-compute[71474]: DEBUG nova.network.neutron [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Instance cache missing network info. {{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:03:11 user nova-compute[71474]: INFO nova.compute.manager [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Took 0.88 seconds to destroy the instance on the hypervisor. 
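
The teardown above follows the standard os-vif flow: the Neutron port dict is converted to a VIFOpenVSwitch object, os_vif.unplug() hands it to the 'ovs' plugin, and the plugin removes the tap port from br-int (the DelPortCommand transaction in the ovsdbapp DEBUG lines). A minimal sketch of that flow outside of Nova, assuming the public os-vif entry points (initialize/unplug) and constructing only the object fields visible in the log; the UUIDs, MAC and device names are copied from the records above purely for illustration, and running it for real requires root and a local OVS database:

    # Minimal sketch: unplugging an OVS VIF with os-vif, mirroring the
    # "Converted object VIFOpenVSwitch(...) -> Unplugging vif" sequence above.
    import os_vif
    from os_vif.objects import instance_info, network, vif

    os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins

    inst = instance_info.InstanceInfo(
        uuid='f0f32b68-6993-4843-bcc6-bd0e06377b27',
        name='tempest-VolumesAdminNegativeTest-server-1193021950')

    ovs_vif = vif.VIFOpenVSwitch(
        id='20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b',
        address='fa:16:3e:02:78:94',
        vif_name='tap20ca5a57-3c',
        bridge_name='br-int',
        network=network.Network(id='6e372a6f-6444-4977-be86-7a6bb86d8979'),
        port_profile=vif.VIFPortProfileOpenVSwitch(
            interface_id='20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b'))

    # The 'ovs' plugin resolves this call to the DelPortCommand(port=tap...,
    # bridge=br-int, if_exists=True) transaction seen in the log.
    os_vif.unplug(ovs_vif, inst)
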
Apr 21 14:03:11 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.network.neutron [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Updating instance_info_cache with network_info: [{"id": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "address": "fa:16:3e:ae:cb:ca", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap10363ff5-34", "ovs_interfaceid": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Releasing lock "refresh_cache-a205a2a4-c0de-4c5c-abc4-7b034070e014" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.compute.manager [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Instance network_info: |[{"id": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "address": "fa:16:3e:ae:cb:ca", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap10363ff5-34", "ovs_interfaceid": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Start _get_guest_xml network_info=[{"id": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "address": "fa:16:3e:ae:cb:ca", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap10363ff5-34", "ovs_interfaceid": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:03:11 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 14:03:11 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 
21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:03:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1976108837',display_name='tempest-ServerRescueNegativeTestJSON-server-1976108837',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1976108837',id=16,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='432a123307454a44922597d6c9089447',ramdisk_id='',reservation_id='r-bqqswuy3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-193683719',owner_user_name='tempest-ServerRescueNegativeTestJSON-193683719-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:03:09Z,user_data=None,user_id='4df58f0cb48f4aa29df57f9c2f632782',uuid=a205a2a4-c0de-4c5c-abc4-7b034070e014,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "address": "fa:16:3e:ae:cb:ca", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap10363ff5-34", "ovs_interfaceid": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converting VIF {"id": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "address": 
"fa:16:3e:ae:cb:ca", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap10363ff5-34", "ovs_interfaceid": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:cb:ca,bridge_name='br-int',has_traffic_filtering=True,id=10363ff5-34d7-4af3-bd72-c7cb78d665c9,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10363ff5-34') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.objects.instance [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'pci_devices' on Instance uuid a205a2a4-c0de-4c5c-abc4-7b034070e014 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.network.neutron [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Successfully updated port: def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] End _get_guest_xml xml= Apr 21 14:03:11 user nova-compute[71474]: a205a2a4-c0de-4c5c-abc4-7b034070e014 Apr 21 14:03:11 user nova-compute[71474]: instance-00000010 Apr 21 14:03:11 user nova-compute[71474]: 131072 Apr 21 14:03:11 user nova-compute[71474]: 1 Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: tempest-ServerRescueNegativeTestJSON-server-1976108837 Apr 21 14:03:11 user nova-compute[71474]: 2023-04-21 14:03:11 Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: 128 Apr 21 14:03:11 user nova-compute[71474]: 1 Apr 21 14:03:11 user nova-compute[71474]: 0 Apr 21 14:03:11 user nova-compute[71474]: 0 Apr 21 14:03:11 user nova-compute[71474]: 1 Apr 21 14:03:11 
user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: tempest-ServerRescueNegativeTestJSON-193683719-project-member Apr 21 14:03:11 user nova-compute[71474]: tempest-ServerRescueNegativeTestJSON-193683719 Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: OpenStack Foundation Apr 21 14:03:11 user nova-compute[71474]: OpenStack Nova Apr 21 14:03:11 user nova-compute[71474]: 0.0.0 Apr 21 14:03:11 user nova-compute[71474]: a205a2a4-c0de-4c5c-abc4-7b034070e014 Apr 21 14:03:11 user nova-compute[71474]: a205a2a4-c0de-4c5c-abc4-7b034070e014 Apr 21 14:03:11 user nova-compute[71474]: Virtual Machine Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: hvm Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Nehalem Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: /dev/urandom Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: Apr 21 14:03:11 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:03:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1976108837',display_name='tempest-ServerRescueNegativeTestJSON-server-1976108837',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1976108837',id=16,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='432a123307454a44922597d6c9089447',ramdisk_id='',reservation_id='r-bqqswuy3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-193683719',owner_user_name='tempest-ServerRescueNegativeTestJSON-193683719-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:03:09Z,user_data=None,user_id='4df58f0cb48f4aa29df57f9c2f632782',uuid=a205a2a4-c0de-4c5c-abc4-7b034070e014,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "address": "fa:16:3e:ae:cb:ca", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap10363ff5-34", "ovs_interfaceid": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converting VIF {"id": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "address": 
"fa:16:3e:ae:cb:ca", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap10363ff5-34", "ovs_interfaceid": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:cb:ca,bridge_name='br-int',has_traffic_filtering=True,id=10363ff5-34d7-4af3-bd72-c7cb78d665c9,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10363ff5-34') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG os_vif [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:cb:ca,bridge_name='br-int',has_traffic_filtering=True,id=10363ff5-34d7-4af3-bd72-c7cb78d665c9,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10363ff5-34') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "refresh_cache-80eb182f-948b-42d3-999b-339c5d615a73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 
tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquired lock "refresh_cache-80eb182f-948b-42d3-999b-339c5d615a73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.network.neutron [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap10363ff5-34, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap10363ff5-34, col_values=(('external_ids', {'iface-id': '10363ff5-34d7-4af3-bd72-c7cb78d665c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:cb:ca', 'vm-uuid': 'a205a2a4-c0de-4c5c-abc4-7b034070e014'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:11 user nova-compute[71474]: INFO os_vif [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:cb:ca,bridge_name='br-int',has_traffic_filtering=True,id=10363ff5-34d7-4af3-bd72-c7cb78d665c9,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10363ff5-34') Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.network.neutron [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.compute.manager [req-3c577b1c-3488-4271-9b47-2f1bfb0c7f9b req-8043eec9-6af1-4314-9ed7-e712c53c2353 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received event network-changed-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.compute.manager [req-3c577b1c-3488-4271-9b47-2f1bfb0c7f9b req-8043eec9-6af1-4314-9ed7-e712c53c2353 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Refreshing instance network info cache due to event network-changed-def6080a-bf3f-4516-8140-08f463f69eb7. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-3c577b1c-3488-4271-9b47-2f1bfb0c7f9b req-8043eec9-6af1-4314-9ed7-e712c53c2353 service nova] Acquiring lock "refresh_cache-80eb182f-948b-42d3-999b-339c5d615a73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] No VIF found with MAC fa:16:3e:ae:cb:ca, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.network.neutron [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Updating instance_info_cache with network_info: [{"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 
4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Releasing lock "refresh_cache-80eb182f-948b-42d3-999b-339c5d615a73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Instance network_info: |[{"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-3c577b1c-3488-4271-9b47-2f1bfb0c7f9b req-8043eec9-6af1-4314-9ed7-e712c53c2353 service nova] Acquired lock "refresh_cache-80eb182f-948b-42d3-999b-339c5d615a73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.network.neutron [req-3c577b1c-3488-4271-9b47-2f1bfb0c7f9b req-8043eec9-6af1-4314-9ed7-e712c53c2353 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Refreshing network info cache for port def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Start 
_get_guest_xml network_info=[{"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:03:12 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:03:12 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
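
Plugging the new port, logged above as a single ovsdbapp transaction (AddPortCommand followed by DbSetCommand on the Interface row), attaches tap10363ff5-34 to br-int and sets the external_ids (iface-id, iface-status, attached-mac, vm-uuid) that identify the port to the SDN backend (OVN here, per bound_drivers). A rough sketch of the same two-command transaction issued directly with ovsdbapp's Open_vSwitch API; the ovsdb socket path is an assumption for a typical devstack host, the rest of the values come from the log, and this needs access to the local OVS database:

    # Rough sketch of the "AddPortCommand + DbSetCommand" transaction that the
    # os-vif ovs plugin runs when plugging tap10363ff5-34 into br-int.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB = 'unix:/usr/local/var/run/openvswitch/db.sock'  # assumed path

    idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    external_ids = {
        'iface-id': '10363ff5-34d7-4af3-bd72-c7cb78d665c9',
        'iface-status': 'active',
        'attached-mac': 'fa:16:3e:ae:cb:ca',
        'vm-uuid': 'a205a2a4-c0de-4c5c-abc4-7b034070e014',
    }

    # One transaction, two commands, matching "txn n=1 command(idx=0/1)" above.
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_port('br-int', 'tap10363ff5-34', may_exist=True))
        txn.add(api.db_set('Interface', 'tap10363ff5-34',
                           ('external_ids', external_ids)))
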
Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:03:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-718848885',display_name='tempest-ServerRescueNegativeTestJSON-server-718848885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-718848885',id=17,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='432a123307454a44922597d6c9089447',ramdisk_id='',reservation_id='r-oiv28z2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-193683719',owner_user_name='tempest-ServerRescueNegativeTestJSON-193683719-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:03:10Z,user_data=None,user_id='4df58f0cb48f4aa29df57f9c2f632782',uuid=80eb182f-948b-42d3-999b-339c5d615a73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converting VIF {"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": 
"fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:23:c8,bridge_name='br-int',has_traffic_filtering=True,id=def6080a-bf3f-4516-8140-08f463f69eb7,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef6080a-bf') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.objects.instance [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'pci_devices' on Instance uuid 80eb182f-948b-42d3-999b-339c5d615a73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] End _get_guest_xml xml= Apr 21 14:03:12 user nova-compute[71474]: 80eb182f-948b-42d3-999b-339c5d615a73 Apr 21 14:03:12 user nova-compute[71474]: instance-00000011 Apr 21 14:03:12 user nova-compute[71474]: 131072 Apr 21 14:03:12 user nova-compute[71474]: 1 Apr 21 14:03:12 user nova-compute[71474]: Apr 21 14:03:12 user nova-compute[71474]: Apr 21 14:03:12 user nova-compute[71474]: Apr 21 14:03:12 user nova-compute[71474]: tempest-ServerRescueNegativeTestJSON-server-718848885 Apr 21 14:03:12 user nova-compute[71474]: 2023-04-21 14:03:12 Apr 21 14:03:12 user nova-compute[71474]: Apr 21 14:03:12 user nova-compute[71474]: 128 Apr 21 14:03:12 user nova-compute[71474]: 1 Apr 21 14:03:12 user nova-compute[71474]: 0 Apr 21 14:03:12 user nova-compute[71474]: 0 Apr 21 14:03:12 user nova-compute[71474]: 1 Apr 21 14:03:12 user nova-compute[71474]: Apr 21 14:03:12 user nova-compute[71474]: Apr 21 14:03:12 user nova-compute[71474]: tempest-ServerRescueNegativeTestJSON-193683719-project-member Apr 21 14:03:12 user nova-compute[71474]: tempest-ServerRescueNegativeTestJSON-193683719 Apr 21 14:03:12 user nova-compute[71474]: Apr 21 14:03:12 user nova-compute[71474]: Apr 21 14:03:12 user nova-compute[71474]: Apr 21 14:03:12 user 
nova-compute[71474]: [libvirt guest XML output whose element markup was lost in capture; recoverable values: name instance-00000011, uuid 80eb182f-948b-42d3-999b-339c5d615a73, memory 131072 KiB, 1 vCPU, sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, machine type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:03:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-718848885',display_name='tempest-ServerRescueNegativeTestJSON-server-718848885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-718848885',id=17,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='432a123307454a44922597d6c9089447',ramdisk_id='',reservation_id='r-oiv28z2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-193683719',owner_user_name='tempest-ServerRescueNegativeTestJSON-193683719-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:03:10Z,user_data=None,user_id='4df58f0cb48f4aa29df57f9c2f632782',uuid=80eb182f-948b-42d3-999b-339c5d615a73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converting VIF {"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", 
"network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:23:c8,bridge_name='br-int',has_traffic_filtering=True,id=def6080a-bf3f-4516-8140-08f463f69eb7,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef6080a-bf') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG os_vif [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:23:c8,bridge_name='br-int',has_traffic_filtering=True,id=def6080a-bf3f-4516-8140-08f463f69eb7,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef6080a-bf') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdef6080a-bf, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction 
[-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdef6080a-bf, col_values=(('external_ids', {'iface-id': 'def6080a-bf3f-4516-8140-08f463f69eb7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:23:c8', 'vm-uuid': '80eb182f-948b-42d3-999b-339c5d615a73'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:12 user nova-compute[71474]: INFO os_vif [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:23:c8,bridge_name='br-int',has_traffic_filtering=True,id=def6080a-bf3f-4516-8140-08f463f69eb7,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef6080a-bf') Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] No VIF found with MAC fa:16:3e:ff:23:c8, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:03:12 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Took 1.52 seconds to deallocate network for instance. 
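The AddBridgeCommand, AddPortCommand and DbSetCommand transactions logged above are issued by the os-vif OVS plugin through ovsdbapp. A minimal sketch of the same sequence follows; the OVSDB socket path, timeout and literal values are assumptions taken from this log, not Nova code.

from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

# Assumed local OVSDB endpoint; adjust for the actual deployment.
OVSDB = 'unix:/usr/local/var/run/openvswitch/db.sock'
idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

# "Running txn n=1 command(idx=0): AddBridgeCommand(... may_exist=True ...)"
with api.transaction(check_error=True) as txn:
    txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))

# AddPortCommand plus DbSetCommand on the Interface row, as in the log above.
with api.transaction(check_error=True) as txn:
    txn.add(api.add_port('br-int', 'tapdef6080a-bf', may_exist=True))
    txn.add(api.db_set(
        'Interface', 'tapdef6080a-bf',
        ('external_ids', {'iface-id': 'def6080a-bf3f-4516-8140-08f463f69eb7',
                          'iface-status': 'active',
                          'attached-mac': 'fa:16:3e:ff:23:c8',
                          'vm-uuid': '80eb182f-948b-42d3-999b-339c5d615a73'})))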
Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.network.neutron [req-3c577b1c-3488-4271-9b47-2f1bfb0c7f9b req-8043eec9-6af1-4314-9ed7-e712c53c2353 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Updated VIF entry in instance network info cache for port def6080a-bf3f-4516-8140-08f463f69eb7. {{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.network.neutron [req-3c577b1c-3488-4271-9b47-2f1bfb0c7f9b req-8043eec9-6af1-4314-9ed7-e712c53c2353 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Updating instance_info_cache with network_info: [{"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-3c577b1c-3488-4271-9b47-2f1bfb0c7f9b req-8043eec9-6af1-4314-9ed7-e712c53c2353 service nova] Releasing lock "refresh_cache-80eb182f-948b-42d3-999b-339c5d615a73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Acquiring lock 
"f0f32b68-6993-4843-bcc6-bd0e06377b27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] No waiting events found dispatching network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:03:12 user nova-compute[71474]: WARNING nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received unexpected event network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b for instance with vm_state deleted and task_state None. 
Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Acquiring lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] No waiting events found dispatching network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:03:12 user nova-compute[71474]: WARNING nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received unexpected event network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b for instance with vm_state deleted and task_state None. 
Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Acquiring lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] No waiting events found dispatching network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:03:12 user nova-compute[71474]: WARNING nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received unexpected event network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b for instance with vm_state deleted and task_state None. 
Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-vif-unplugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Acquiring lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] No waiting events found dispatching network-vif-unplugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:03:12 user nova-compute[71474]: WARNING nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received unexpected event network-vif-unplugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b for instance with vm_state deleted and task_state None. 
Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Acquiring lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] No waiting events found dispatching network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:03:12 user nova-compute[71474]: WARNING nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received unexpected event network-vif-plugged-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b for instance with vm_state deleted and task_state None. 
Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.manager [req-a120e27a-0b4f-4444-a959-6b3dedcb4fb2 req-baa59771-28f0-43a2-af3d-e09ba46a69c3 service nova] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Received event network-vif-deleted-20ca5a57-3cd5-47ad-bdfe-f56a0ecd078b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:03:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.263s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:12 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Deleted allocations for instance f0f32b68-6993-4843-bcc6-bd0e06377b27 Apr 21 14:03:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-771584ca-2050-4d18-9421-540be7661a0f tempest-VolumesAdminNegativeTest-1182596808 tempest-VolumesAdminNegativeTest-1182596808-project-member] Lock "f0f32b68-6993-4843-bcc6-bd0e06377b27" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.864s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG 
nova.compute.manager [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Received event network-vif-plugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] Acquiring lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG nova.compute.manager [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] No waiting events found dispatching network-vif-plugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:03:13 user nova-compute[71474]: WARNING nova.compute.manager [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Received unexpected event network-vif-plugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 for instance with vm_state building and task_state spawning. 
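The inventory dictionary reported a little earlier for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 is what the resource tracker sends to Placement; usable capacity per resource class is derived as (total - reserved) * allocation_ratio. A small sketch reproducing that arithmetic with the figures from the log:

# Capacity implied by the logged inventory: (total - reserved) * allocation_ratio.
inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, capacity)   # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0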
Apr 21 14:03:13 user nova-compute[71474]: DEBUG nova.compute.manager [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Received event network-vif-plugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] Acquiring lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG nova.compute.manager [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] No waiting events found dispatching network-vif-plugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:03:13 user nova-compute[71474]: WARNING nova.compute.manager [req-38985e9f-32d2-46eb-9bc8-765f78d75d7c req-8ab2dd94-db77-4c37-81bc-52bcbcb05853 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Received unexpected event network-vif-plugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 for instance with vm_state building and task_state spawning. 
Apr 21 14:03:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:15 user nova-compute[71474]: DEBUG nova.compute.manager [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received event network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:15 user nova-compute[71474]: DEBUG nova.compute.manager [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] No waiting events found dispatching 
network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:03:15 user nova-compute[71474]: WARNING nova.compute.manager [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received unexpected event network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 for instance with vm_state building and task_state spawning. Apr 21 14:03:15 user nova-compute[71474]: DEBUG nova.compute.manager [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received event network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:03:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:15 user nova-compute[71474]: DEBUG nova.compute.manager [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] No waiting events found dispatching network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:03:15 user nova-compute[71474]: WARNING nova.compute.manager [req-8a30ed1f-a078-4abb-adf2-231154cd0154 req-4d757970-aa4b-4cd2-94a3-93fe1547478c service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received unexpected event network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 for instance with vm_state building and task_state spawning. 
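The network-vif-plugged events being dispatched above reach nova-compute through Nova's os-server-external-events API, which Neutron calls when a port becomes active. A hedged sketch of that request; the endpoint URL and token are placeholders, and only the payload shape and the UUIDs from this log are taken as given.

import requests

NOVA = 'http://controller:8774/v2.1'   # placeholder endpoint
TOKEN = 'keystone-token-here'          # placeholder credential

payload = {'events': [{
    'name': 'network-vif-plugged',
    'server_uuid': '80eb182f-948b-42d3-999b-339c5d615a73',
    'tag': 'def6080a-bf3f-4516-8140-08f463f69eb7',   # the Neutron port id
    'status': 'completed',
}]}
resp = requests.post(f'{NOVA}/os-server-external-events',
                     json=payload, headers={'X-Auth-Token': TOKEN})
resp.raise_for_status()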
Apr 21 14:03:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] VM Resumed (Lifecycle Event) Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Instance spawned successfully. Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] VM Started (Lifecycle Event) Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM 
power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Took 7.29 seconds to spawn the instance on the hypervisor. Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Took 7.92 seconds to build instance. Apr 21 14:03:16 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-652cf0df-de15-414f-a288-3d8bd171d651 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.015s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] VM Resumed (Lifecycle Event) Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Instance spawned successfully. 
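The "Found default for hw_cdrom_bus of ide" / "hw_disk_bus of virtio" registrations above (and repeated below for the second instance) happen because the cirros image carries none of those properties. A hedged sketch of pinning them on the Glance image instead, using openstacksdk; the cloud name is an assumption, and whether update_image accepts arbitrary hw_* keyword properties should be verified against the SDK version in use.

import openstack

conn = openstack.connect(cloud='devstack')   # assumed clouds.yaml entry
image = conn.image.find_image('2edfef44-2867-4e03-a53e-b139f99afa75')
# Set the bus/model properties explicitly so the driver need not register
# per-instance defaults at spawn time.
conn.image.update_image(image,
                        hw_cdrom_bus='ide',
                        hw_disk_bus='virtio',
                        hw_video_model='virtio',
                        hw_vif_model='virtio')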
Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Synchronizing instance power state 
after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] VM Started (Lifecycle Event) Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Took 6.65 seconds to spawn the instance on the hypervisor. Apr 21 14:03:16 user nova-compute[71474]: DEBUG nova.compute.manager [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:03:16 user nova-compute[71474]: INFO nova.compute.manager [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Took 7.42 seconds to build instance. 
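[Editor's note] The two "Synchronizing instance power state" entries above show the guard nova-compute applies when libvirt lifecycle events ("Resumed", "Started") race with an in-flight build: the database still records power_state 0 while the hypervisor already reports 1, but because the instance has task_state "spawning" the sync is skipped instead of overwriting the build. A minimal illustrative sketch of that decision follows; it is not Nova's actual _sync_instance_power_state code, and the constants are assumed from the values visible in the log (0 = NOSTATE, 1 = RUNNING).

    # Illustrative sketch only -- not Nova's implementation.
    # Mirrors the behaviour logged above: while a task such as 'spawning'
    # owns the instance, lifecycle-driven power-state sync is skipped.

    NOSTATE = 0   # matches "current DB power_state: 0" in the log
    RUNNING = 1   # matches "VM power_state: 1" in the log

    def should_sync_power_state(db_power_state: int,
                                vm_power_state: int,
                                task_state: str | None) -> bool:
        """Return True only if it is safe to copy the hypervisor state to the DB."""
        if task_state is not None:
            # An in-flight operation (e.g. 'spawning') will set the final state itself.
            return False
        return db_power_state != vm_power_state

    # The situation logged above: a building/spawning instance reported RUNNING.
    assert should_sync_power_state(NOSTATE, RUNNING, "spawning") is False
    assert should_sync_power_state(NOSTATE, RUNNING, None) is True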
Apr 21 14:03:16 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-84d27e58-e667-4d64-a887-8d04a68185e4 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "80eb182f-948b-42d3-999b-339c5d615a73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.534s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:17 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:22 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:25 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:03:25 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] VM Stopped (Lifecycle Event) Apr 21 14:03:25 user nova-compute[71474]: DEBUG nova.compute.manager [None req-98599b41-8747-4b9e-a7fb-571e99bbab64 None None] [instance: f0f32b68-6993-4843-bcc6-bd0e06377b27] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:03:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:32 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:42 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:47 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:47 user nova-compute[71474]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:52 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:52 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:52 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Cleaning up deleted instances with incomplete migration {{(pid=71474) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 21 14:03:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:53 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:53 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:54 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:54 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:54 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:03:54 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json" returned: 0 in 0.183s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json" returned: 0 in 0.164s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:56 user nova-compute[71474]: DEBUG 
oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:56 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
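[Editor's note] The update_available_resource audit above probes every instance disk with "qemu-img info", and each probe is wrapped by oslo_concurrency's prlimit helper so a runaway qemu-img process cannot exhaust memory or CPU on the compute host: the child gets a 1 GiB address-space cap (--as=1073741824) and a 30-second CPU cap (--cpu=30), which is exactly what produces the "/usr/bin/python3.10 -m oslo_concurrency.prlimit ... --" prefix seen in the commands. A small sketch of that pattern, assuming oslo.concurrency is installed and qemu-img is on PATH; the helper name probe_disk is illustrative, not Nova's.

    # Sketch: bound a qemu-img probe the way the logged commands are bounded.
    import json

    from oslo_concurrency import processutils

    QEMU_IMG_LIMITS = processutils.ProcessLimits(
        cpu_time=30,                  # --cpu=30
        address_space=1 * 1024 ** 3,  # --as=1073741824 (1 GiB)
    )

    def probe_disk(path: str) -> dict:
        """Run 'qemu-img info' on an image file under CPU and address-space limits."""
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=QEMU_IMG_LIMITS)
        return json.loads(out)

    # e.g. probe_disk('/opt/stack/data/nova/instances/<uuid>/disk')['virtual-size']

Passing prlimit= makes processutils re-exec the command through "python -m oslo_concurrency.prlimit" with the corresponding --as/--cpu flags, which is why every qemu-img invocation in this log carries that wrapper.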
Apr 21 14:03:56 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:03:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8582MB free_disk=26.093677520751953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": 
null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", 
"address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:03:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 30068c4a-94ed-4b84-9178-0d554326fc68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5e502c4c-a46b-4670-acba-2fda2d05adf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance a205a2a4-c0de-4c5c-abc4-7b034070e014 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 80eb182f-948b-42d3-999b-339c5d615a73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Refreshing inventories for resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Updating ProviderTree inventory for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Updating inventory in ProviderTree for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Refreshing aggregate associations for resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7, aggregates: None {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Refreshing trait associations for 
resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Cleaning up deleted instances {{(pid=71474) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 21 14:03:57 user nova-compute[71474]: DEBUG 
nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] There are 0 instances to clean {{(pid=71474) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG nova.compute.manager [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Starting instance... 
{{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:58 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:03:58 user nova-compute[71474]: INFO nova.compute.claims [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Claim successful on node user Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.compute.manager [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Start building networks asynchronously for instance. 
{{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Updating instance_info_cache with network_info: [{"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": "fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.compute.manager [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.network.neutron [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:03:59 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.compute.manager [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.policy [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ab1d2ed7df2f4a9bbf14da7e2c5fece2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0ccc2c950364fcbb0f2b1cc937f6a82', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.compute.manager [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:03:59 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Creating image(s) Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "/opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "/opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock 
"/opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.142s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.143s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG 
oslo_concurrency.processutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk 1073741824" returned: 0 in 0.052s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.203s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.147s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Checking if we can resize image /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk. 
size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.network.neutron [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Successfully created port: 451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Cannot resize image /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk to a smaller size. 
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.objects.instance [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lazy-loading 'migration_context' on Instance uuid aac5a363-5528-4d5f-8c90-6f9ad69a06dd {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Ensure instance console log exists: /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:03:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG nova.network.neutron [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Successfully updated port: 451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "refresh_cache-aac5a363-5528-4d5f-8c90-6f9ad69a06dd" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 
tempest-AttachVolumeNegativeTest-166063504-project-member] Acquired lock "refresh_cache-aac5a363-5528-4d5f-8c90-6f9ad69a06dd" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG nova.network.neutron [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG nova.compute.manager [req-9746c4ef-86e7-4b1b-a0d5-3a1f5487f3e4 req-7a1021f7-e05b-4306-acc5-6794166ccf12 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Received event network-changed-451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG nova.compute.manager [req-9746c4ef-86e7-4b1b-a0d5-3a1f5487f3e4 req-7a1021f7-e05b-4306-acc5-6794166ccf12 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Refreshing instance network info cache due to event network-changed-451630a7-eec6-4614-8737-498026e8d671. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9746c4ef-86e7-4b1b-a0d5-3a1f5487f3e4 req-7a1021f7-e05b-4306-acc5-6794166ccf12 service nova] Acquiring lock "refresh_cache-aac5a363-5528-4d5f-8c90-6f9ad69a06dd" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG nova.network.neutron [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Instance cache missing network info. {{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:04:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Updating instance_info_cache with network_info: [{"id": "451630a7-eec6-4614-8737-498026e8d671", "address": "fa:16:3e:16:01:e5", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap451630a7-ee", "ovs_interfaceid": "451630a7-eec6-4614-8737-498026e8d671", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Releasing lock "refresh_cache-aac5a363-5528-4d5f-8c90-6f9ad69a06dd" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Instance network_info: |[{"id": "451630a7-eec6-4614-8737-498026e8d671", "address": "fa:16:3e:16:01:e5", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap451630a7-ee", "ovs_interfaceid": "451630a7-eec6-4614-8737-498026e8d671", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils 
[req-9746c4ef-86e7-4b1b-a0d5-3a1f5487f3e4 req-7a1021f7-e05b-4306-acc5-6794166ccf12 service nova] Acquired lock "refresh_cache-aac5a363-5528-4d5f-8c90-6f9ad69a06dd" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.network.neutron [req-9746c4ef-86e7-4b1b-a0d5-3a1f5487f3e4 req-7a1021f7-e05b-4306-acc5-6794166ccf12 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Refreshing network info cache for port 451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Start _get_guest_xml network_info=[{"id": "451630a7-eec6-4614-8737-498026e8d671", "address": "fa:16:3e:16:01:e5", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap451630a7-ee", "ovs_interfaceid": "451630a7-eec6-4614-8737-498026e8d671", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:04:01 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 14:04:01 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.hardware 
[None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:03:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2067580371',display_name='tempest-AttachVolumeNegativeTest-server-2067580371',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2067580371',id=18,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFlxdy5PoliBeRnt4tWYC7Fu7pkGOage/HG8uxLBIgt7DEFt1QHK8dwArdP14y447xyPamNSB8z6pOZhh9qz3WsCy+knrmjpHD2UmoJzo5C/B6YDf9z6px5GYzSzQaO+nA==',key_name='tempest-keypair-500746232',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0ccc2c950364fcbb0f2b1cc937f6a82',ramdisk_id='',reservation_id='r-uma60tll',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-166063504',owner_user_name='tempest-AttachVolumeNegativeTest-166063504-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:03:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ab1d2ed7df2f4a9bbf14da7e2c5fece2',uuid=aac5a363-5528-4d5f-8c90-6f9ad69a06dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "451630a7-eec6-4614-8737-498026e8d671", "address": "fa:16:3e:16:01:e5", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap451630a7-ee", "ovs_interfaceid": "451630a7-eec6-4614-8737-498026e8d671", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converting VIF {"id": "451630a7-eec6-4614-8737-498026e8d671", "address": "fa:16:3e:16:01:e5", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap451630a7-ee", "ovs_interfaceid": "451630a7-eec6-4614-8737-498026e8d671", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:01:e5,bridge_name='br-int',has_traffic_filtering=True,id=451630a7-eec6-4614-8737-498026e8d671,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap451630a7-ee') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.objects.instance [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lazy-loading 'pci_devices' on Instance uuid aac5a363-5528-4d5f-8c90-6f9ad69a06dd {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] End _get_guest_xml xml= Apr 21 14:04:01 user nova-compute[71474]: aac5a363-5528-4d5f-8c90-6f9ad69a06dd Apr 21 14:04:01 user nova-compute[71474]: instance-00000012 Apr 21 14:04:01 user nova-compute[71474]: 131072 Apr 21 14:04:01 user nova-compute[71474]: 1 Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: tempest-AttachVolumeNegativeTest-server-2067580371 Apr 21 14:04:01 user nova-compute[71474]: 2023-04-21 14:04:01 Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: 128 Apr 21 14:04:01 user nova-compute[71474]: 1 Apr 21 14:04:01 user nova-compute[71474]: 0 Apr 21 14:04:01 user nova-compute[71474]: 0 Apr 21 14:04:01 user nova-compute[71474]: 1 Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: tempest-AttachVolumeNegativeTest-166063504-project-member Apr 21 14:04:01 user nova-compute[71474]: tempest-AttachVolumeNegativeTest-166063504 Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: OpenStack Foundation Apr 21 14:04:01 user nova-compute[71474]: OpenStack Nova Apr 21 14:04:01 user nova-compute[71474]: 0.0.0 Apr 21 14:04:01 user 
nova-compute[71474]: aac5a363-5528-4d5f-8c90-6f9ad69a06dd Apr 21 14:04:01 user nova-compute[71474]: aac5a363-5528-4d5f-8c90-6f9ad69a06dd Apr 21 14:04:01 user nova-compute[71474]: Virtual Machine Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: hvm Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Nehalem Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: /dev/urandom Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: Apr 21 14:04:01 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:03:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2067580371',display_name='tempest-AttachVolumeNegativeTest-server-2067580371',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2067580371',id=18,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFlxdy5PoliBeRnt4tWYC7Fu7pkGOage/HG8uxLBIgt7DEFt1QHK8dwArdP14y447xyPamNSB8z6pOZhh9qz3WsCy+knrmjpHD2UmoJzo5C/B6YDf9z6px5GYzSzQaO+nA==',key_name='tempest-keypair-500746232',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0ccc2c950364fcbb0f2b1cc937f6a82',ramdisk_id='',reservation_id='r-uma60tll',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-166063504',owner_user_name='tempest-AttachVolumeNegativeTest-166063504-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:03:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ab1d2ed7df2f4a9bbf14da7e2c5fece2',uuid=aac5a363-5528-4d5f-8c90-6f9ad69a06dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "451630a7-eec6-4614-8737-498026e8d671", "address": "fa:16:3e:16:01:e5", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap451630a7-ee", "ovs_interfaceid": "451630a7-eec6-4614-8737-498026e8d671", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converting VIF {"id": "451630a7-eec6-4614-8737-498026e8d671", "address": "fa:16:3e:16:01:e5", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap451630a7-ee", "ovs_interfaceid": "451630a7-eec6-4614-8737-498026e8d671", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:01:e5,bridge_name='br-int',has_traffic_filtering=True,id=451630a7-eec6-4614-8737-498026e8d671,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap451630a7-ee') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG os_vif [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:01:e5,bridge_name='br-int',has_traffic_filtering=True,id=451630a7-eec6-4614-8737-498026e8d671,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap451630a7-ee') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap451630a7-ee, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap451630a7-ee, col_values=(('external_ids', {'iface-id': '451630a7-eec6-4614-8737-498026e8d671', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:16:01:e5', 'vm-uuid': 'aac5a363-5528-4d5f-8c90-6f9ad69a06dd'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:01 user nova-compute[71474]: INFO os_vif [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:01:e5,bridge_name='br-int',has_traffic_filtering=True,id=451630a7-eec6-4614-8737-498026e8d671,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap451630a7-ee') Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] No VIF found with MAC fa:16:3e:16:01:e5, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.network.neutron [req-9746c4ef-86e7-4b1b-a0d5-3a1f5487f3e4 req-7a1021f7-e05b-4306-acc5-6794166ccf12 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Updated VIF entry in instance network info cache for port 451630a7-eec6-4614-8737-498026e8d671. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG nova.network.neutron [req-9746c4ef-86e7-4b1b-a0d5-3a1f5487f3e4 req-7a1021f7-e05b-4306-acc5-6794166ccf12 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Updating instance_info_cache with network_info: [{"id": "451630a7-eec6-4614-8737-498026e8d671", "address": "fa:16:3e:16:01:e5", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap451630a7-ee", "ovs_interfaceid": "451630a7-eec6-4614-8737-498026e8d671", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:04:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9746c4ef-86e7-4b1b-a0d5-3a1f5487f3e4 req-7a1021f7-e05b-4306-acc5-6794166ccf12 service nova] Releasing lock "refresh_cache-aac5a363-5528-4d5f-8c90-6f9ad69a06dd" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:04:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:02 user nova-compute[71474]: DEBUG nova.compute.manager [req-57dc499d-a9a1-4c2c-8181-d9397c04b83d req-6fc2c606-655c-48f8-af1d-deae31032293 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Received event network-vif-plugged-451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:04:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-57dc499d-a9a1-4c2c-8181-d9397c04b83d req-6fc2c606-655c-48f8-af1d-deae31032293 service nova] Acquiring lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-57dc499d-a9a1-4c2c-8181-d9397c04b83d req-6fc2c606-655c-48f8-af1d-deae31032293 service nova] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:02 user 
nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-57dc499d-a9a1-4c2c-8181-d9397c04b83d req-6fc2c606-655c-48f8-af1d-deae31032293 service nova] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:02 user nova-compute[71474]: DEBUG nova.compute.manager [req-57dc499d-a9a1-4c2c-8181-d9397c04b83d req-6fc2c606-655c-48f8-af1d-deae31032293 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] No waiting events found dispatching network-vif-plugged-451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:04:02 user nova-compute[71474]: WARNING nova.compute.manager [req-57dc499d-a9a1-4c2c-8181-d9397c04b83d req-6fc2c606-655c-48f8-af1d-deae31032293 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Received unexpected event network-vif-plugged-451630a7-eec6-4614-8737-498026e8d671 for instance with vm_state building and task_state spawning. Apr 21 14:04:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:04:04 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] VM Resumed (Lifecycle Event) Apr 21 14:04:04 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Instance spawned successfully. 
Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: 
building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:04:04 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:04:04 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] VM Started (Lifecycle Event) Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:04:04 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:04:04 user nova-compute[71474]: INFO nova.compute.manager [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Took 5.54 seconds to spawn the instance on the hypervisor. Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:04:04 user nova-compute[71474]: INFO nova.compute.manager [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Took 6.18 seconds to build instance. 
Apr 21 14:04:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-be5b22e5-55c3-482e-965b-a7738789623b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.285s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.compute.manager [req-5abbc397-51b0-49cc-a796-fb8e9b3f3bb5 req-be51dae0-827f-4070-901a-aeac2887882d service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Received event network-vif-plugged-451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5abbc397-51b0-49cc-a796-fb8e9b3f3bb5 req-be51dae0-827f-4070-901a-aeac2887882d service nova] Acquiring lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5abbc397-51b0-49cc-a796-fb8e9b3f3bb5 req-be51dae0-827f-4070-901a-aeac2887882d service nova] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5abbc397-51b0-49cc-a796-fb8e9b3f3bb5 req-be51dae0-827f-4070-901a-aeac2887882d service nova] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:04 user nova-compute[71474]: DEBUG nova.compute.manager [req-5abbc397-51b0-49cc-a796-fb8e9b3f3bb5 req-be51dae0-827f-4070-901a-aeac2887882d service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] No waiting events found dispatching network-vif-plugged-451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:04:04 user nova-compute[71474]: WARNING nova.compute.manager [req-5abbc397-51b0-49cc-a796-fb8e9b3f3bb5 req-be51dae0-827f-4070-901a-aeac2887882d service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Received unexpected event network-vif-plugged-451630a7-eec6-4614-8737-498026e8d671 for instance with vm_state active and task_state None. 
Apr 21 14:04:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:06 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:07 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:16 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:22 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} 
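The repeated vlog entries above are the OVSDB IDL's poll loop waking up because its connection to ovsdb-server (file descriptor 22 in this process) is readable; each wakeup is followed by the IDL reading and applying any pending database updates. A minimal sketch of such a POLLIN wakeup, assuming a local pipe as a stand-in for the ovsdb socket (the fd number in the log is simply whatever descriptor that socket received):

    import os
    import select

    # Minimal illustration of a POLLIN wakeup like the vlog lines above.
    r, w = os.pipe()
    poller = select.poll()
    poller.register(r, select.POLLIN)

    os.write(w, b"update")                    # peer sends data
    for fd, events in poller.poll(1000):
        if events & select.POLLIN:
            print("[POLLIN] on fd %d" % fd)   # the IDL would now read and
            os.read(fd, 1024)                 # apply the database update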
Apr 21 14:04:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:30 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:34 user nova-compute[71474]: DEBUG nova.compute.manager [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:04:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:04:34 user nova-compute[71474]: INFO nova.compute.claims [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Claim successful on node user Apr 21 14:04:35 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.397s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG nova.compute.manager [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG nova.compute.manager [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG nova.network.neutron [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:04:35 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 21 14:04:35 user nova-compute[71474]: DEBUG nova.compute.manager [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG nova.policy [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d40cdc3312b43d286d8a79cde9f5418', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cfa1f4e6f7864477b911420ea2ecb982', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG nova.compute.manager [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:04:35 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Creating image(s) Apr 21 14:04:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "/opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "/opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils 
[None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "/opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "7c331e3f1b40931e7a558125bffae1edcb9a9522" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "7c331e3f1b40931e7a558125bffae1edcb9a9522" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522.part --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522.part --force-share --output=json" returned: 0 in 0.141s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG nova.virt.images [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] a08ba1b8-74ec-4c3c-9d31-0cd58b006bb0 was qcow2, converting to raw {{(pid=71474) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG nova.privsep.utils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71474) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 
/opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522.part /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522.converted {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522.part /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522.converted" returned: 0 in 0.490s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522.converted --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG nova.network.neutron [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Successfully created port: c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522.converted --force-share --output=json" returned: 0 in 0.142s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "7c331e3f1b40931e7a558125bffae1edcb9a9522" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.097s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522 --force-share --output=json" returned: 0 in 0.142s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "7c331e3f1b40931e7a558125bffae1edcb9a9522" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "7c331e3f1b40931e7a558125bffae1edcb9a9522" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522 --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522,backing_fmt=raw /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG 
oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522,backing_fmt=raw /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk 1073741824" returned: 0 in 0.056s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "7c331e3f1b40931e7a558125bffae1edcb9a9522" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.201s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:36 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/7c331e3f1b40931e7a558125bffae1edcb9a9522 --force-share --output=json" returned: 0 in 0.141s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Checking if we can resize image /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk. 
size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Cannot resize image /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk to a smaller size. {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.objects.instance [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lazy-loading 'migration_context' on Instance uuid 5cf0c20f-ffda-4578-adae-9aaef4c4bd18 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Ensure instance console log exists: /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.network.neutron [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Successfully updated port: c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "refresh_cache-5cf0c20f-ffda-4578-adae-9aaef4c4bd18" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquired lock "refresh_cache-5cf0c20f-ffda-4578-adae-9aaef4c4bd18" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.network.neutron [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.compute.manager [req-4c0d29fb-99d8-46e6-987a-83ed2f08602e req-a8955481-b1fd-410d-9d6a-0230ed31448f service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Received event network-changed-c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.compute.manager [req-4c0d29fb-99d8-46e6-987a-83ed2f08602e req-a8955481-b1fd-410d-9d6a-0230ed31448f service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Refreshing instance network info cache due to event network-changed-c4696818-c28f-4798-a5c4-1a4b64a5a79f. 
{{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-4c0d29fb-99d8-46e6-987a-83ed2f08602e req-a8955481-b1fd-410d-9d6a-0230ed31448f service nova] Acquiring lock "refresh_cache-5cf0c20f-ffda-4578-adae-9aaef4c4bd18" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.network.neutron [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Instance cache missing network info. {{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:04:37 user nova-compute[71474]: INFO nova.compute.manager [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] instance snapshotting Apr 21 14:04:37 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Beginning live snapshot process Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.network.neutron [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Updating instance_info_cache with network_info: [{"id": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "address": "fa:16:3e:97:2a:26", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4696818-c2", "ovs_interfaceid": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Releasing lock "refresh_cache-5cf0c20f-ffda-4578-adae-9aaef4c4bd18" {{(pid=71474) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.compute.manager [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Instance network_info: |[{"id": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "address": "fa:16:3e:97:2a:26", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4696818-c2", "ovs_interfaceid": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-4c0d29fb-99d8-46e6-987a-83ed2f08602e req-a8955481-b1fd-410d-9d6a-0230ed31448f service nova] Acquired lock "refresh_cache-5cf0c20f-ffda-4578-adae-9aaef4c4bd18" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.network.neutron [req-4c0d29fb-99d8-46e6-987a-83ed2f08602e req-a8955481-b1fd-410d-9d6a-0230ed31448f service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Refreshing network info cache for port c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Start _get_guest_xml network_info=[{"id": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "address": "fa:16:3e:97:2a:26", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4696818-c2", "ovs_interfaceid": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 
'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T14:04:31Z,direct_url=,disk_format='qcow2',id=a08ba1b8-74ec-4c3c-9d31-0cd58b006bb0,min_disk=0,min_ram=0,name='tempest-scenario-img--610996331',owner='cfa1f4e6f7864477b911420ea2ecb982',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T14:04:32Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': 'a08ba1b8-74ec-4c3c-9d31-0cd58b006bb0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:04:37 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:04:37 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T14:04:31Z,direct_url=,disk_format='qcow2',id=a08ba1b8-74ec-4c3c-9d31-0cd58b006bb0,min_disk=0,min_ram=0,name='tempest-scenario-img--610996331',owner='cfa1f4e6f7864477b911420ea2ecb982',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T14:04:32Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:04:37 user 
nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:04:37 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:04:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-478405333',display_name='tempest-TestMinimumBasicScenario-server-478405333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-478405333',id=19,image_ref='a08ba1b8-74ec-4c3c-9d31-0cd58b006bb0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBy5F7/eY34kH4kknTNvubOiNWmdv324rEiVr8ZZ6u/8wGu10U4U/vV+TgZkfkWQO0m1rbrGO251QOQqyVSfRO8QzK8Jq0lU+/cWevf7A1waDImolju4hBpvNELhkWYjog==',key_name='tempest-TestMinimumBasicScenario-414912251',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cfa1f4e6f7864477b911420ea2ecb982',ramdisk_id='',reservation_id='r-wkygfj0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a08ba1b8-74ec-4c3c-9d31-0cd58b006bb0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-515927679',owner_user_name='tempest-TestMinimumBasicScenario-515927679-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:04:35Z,user_data=None,user_id='9d40cdc3312b43d286d8a79cde9f5418',uuid=5cf0c20f-ffda-4578-adae-9aaef4c4bd18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "address": "fa:16:3e:97:2a:26", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4696818-c2", "ovs_interfaceid": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Converting VIF {"id": 
"c4696818-c28f-4798-a5c4-1a4b64a5a79f", "address": "fa:16:3e:97:2a:26", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4696818-c2", "ovs_interfaceid": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:2a:26,bridge_name='br-int',has_traffic_filtering=True,id=c4696818-c28f-4798-a5c4-1a4b64a5a79f,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4696818-c2') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.objects.instance [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lazy-loading 'pci_devices' on Instance uuid 5cf0c20f-ffda-4578-adae-9aaef4c4bd18 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json -f qcow2 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] End _get_guest_xml xml= Apr 21 14:04:38 user nova-compute[71474]: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18 Apr 21 14:04:38 user nova-compute[71474]: instance-00000013 Apr 21 14:04:38 user nova-compute[71474]: 131072 Apr 21 14:04:38 user nova-compute[71474]: 1 Apr 21 14:04:38 user nova-compute[71474]: Apr 21 14:04:38 user nova-compute[71474]: Apr 21 14:04:38 user nova-compute[71474]: Apr 21 14:04:38 user nova-compute[71474]: tempest-TestMinimumBasicScenario-server-478405333 Apr 21 14:04:38 user nova-compute[71474]: 2023-04-21 14:04:37 Apr 21 14:04:38 user nova-compute[71474]: Apr 21 14:04:38 user 
nova-compute[71474]: [guest XML body not preserved in this capture (markup stripped); recoverable fields: uuid 5cf0c20f-ffda-4578-adae-9aaef4c4bd18, name instance-00000013, memory 131072, 1 vCPU, project tempest-TestMinimumBasicScenario-515927679, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3
tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:04:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-478405333',display_name='tempest-TestMinimumBasicScenario-server-478405333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-478405333',id=19,image_ref='a08ba1b8-74ec-4c3c-9d31-0cd58b006bb0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBy5F7/eY34kH4kknTNvubOiNWmdv324rEiVr8ZZ6u/8wGu10U4U/vV+TgZkfkWQO0m1rbrGO251QOQqyVSfRO8QzK8Jq0lU+/cWevf7A1waDImolju4hBpvNELhkWYjog==',key_name='tempest-TestMinimumBasicScenario-414912251',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cfa1f4e6f7864477b911420ea2ecb982',ramdisk_id='',reservation_id='r-wkygfj0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a08ba1b8-74ec-4c3c-9d31-0cd58b006bb0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-515927679',owner_user_name='tempest-TestMinimumBasicScenario-515927679-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:04:35Z,user_data=None,user_id='9d40cdc3312b43d286d8a79cde9f5418',uuid=5cf0c20f-ffda-4578-adae-9aaef4c4bd18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "address": "fa:16:3e:97:2a:26", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4696818-c2", "ovs_interfaceid": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 
tempest-TestMinimumBasicScenario-515927679-project-member] Converting VIF {"id": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "address": "fa:16:3e:97:2a:26", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4696818-c2", "ovs_interfaceid": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:2a:26,bridge_name='br-int',has_traffic_filtering=True,id=c4696818-c28f-4798-a5c4-1a4b64a5a79f,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4696818-c2') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG os_vif [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:2a:26,bridge_name='br-int',has_traffic_filtering=True,id=c4696818-c28f-4798-a5c4-1a4b64a5a79f,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4696818-c2') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc4696818-c2, may_exist=True) {{(pid=71474) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc4696818-c2, col_values=(('external_ids', {'iface-id': 'c4696818-c28f-4798-a5c4-1a4b64a5a79f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:2a:26', 'vm-uuid': '5cf0c20f-ffda-4578-adae-9aaef4c4bd18'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:38 user nova-compute[71474]: INFO os_vif [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:2a:26,bridge_name='br-int',has_traffic_filtering=True,id=c4696818-c28f-4798-a5c4-1a4b64a5a79f,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4696818-c2') Apr 21 14:04:38 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json -f qcow2" returned: 0 in 0.145s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json -f qcow2 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] No VIF found with MAC fa:16:3e:97:2a:26, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json -f qcow2" returned: 0 in 0.139s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.145s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp_q4dek_l/f1b211030927475a9a548613ba9ab4ac.delta 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp_q4dek_l/f1b211030927475a9a548613ba9ab4ac.delta 1073741824" returned: 0 in 0.062s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:38 user 
nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Quiescing instance not available: QEMU guest agent is not enabled. Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.network.neutron [req-4c0d29fb-99d8-46e6-987a-83ed2f08602e req-a8955481-b1fd-410d-9d6a-0230ed31448f service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Updated VIF entry in instance network info cache for port c4696818-c28f-4798-a5c4-1a4b64a5a79f. {{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG nova.network.neutron [req-4c0d29fb-99d8-46e6-987a-83ed2f08602e req-a8955481-b1fd-410d-9d6a-0230ed31448f service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Updating instance_info_cache with network_info: [{"id": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "address": "fa:16:3e:97:2a:26", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4696818-c2", "ovs_interfaceid": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:04:38 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-4c0d29fb-99d8-46e6-987a-83ed2f08602e req-a8955481-b1fd-410d-9d6a-0230ed31448f service nova] Releasing lock "refresh_cache-5cf0c20f-ffda-4578-adae-9aaef4c4bd18" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:04:39 user nova-compute[71474]: DEBUG nova.virt.libvirt.guest [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71474) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 14:04:39 user nova-compute[71474]: DEBUG nova.virt.libvirt.guest [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71474) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 21 14:04:39 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Skipping quiescing instance: QEMU guest agent 
is not enabled. Apr 21 14:04:39 user nova-compute[71474]: DEBUG nova.privsep.utils [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71474) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 14:04:39 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp_q4dek_l/f1b211030927475a9a548613ba9ab4ac.delta /opt/stack/data/nova/instances/snapshots/tmp_q4dek_l/f1b211030927475a9a548613ba9ab4ac {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:40 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp_q4dek_l/f1b211030927475a9a548613ba9ab4ac.delta /opt/stack/data/nova/instances/snapshots/tmp_q4dek_l/f1b211030927475a9a548613ba9ab4ac" returned: 0 in 0.379s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:40 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Snapshot extracted, beginning image upload Apr 21 14:04:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:40 user nova-compute[71474]: DEBUG nova.compute.manager [req-38f25cad-d8ab-4b8d-b8ea-c02aaec81908 req-c25f45fe-a990-42b6-86a9-c06e793343f2 service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Received event network-vif-plugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:04:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-38f25cad-d8ab-4b8d-b8ea-c02aaec81908 req-c25f45fe-a990-42b6-86a9-c06e793343f2 service nova] Acquiring lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-38f25cad-d8ab-4b8d-b8ea-c02aaec81908 req-c25f45fe-a990-42b6-86a9-c06e793343f2 service nova] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:40 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [req-38f25cad-d8ab-4b8d-b8ea-c02aaec81908 req-c25f45fe-a990-42b6-86a9-c06e793343f2 service nova] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:40 user nova-compute[71474]: DEBUG nova.compute.manager [req-38f25cad-d8ab-4b8d-b8ea-c02aaec81908 req-c25f45fe-a990-42b6-86a9-c06e793343f2 service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] No waiting events found dispatching network-vif-plugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:04:40 user nova-compute[71474]: WARNING nova.compute.manager [req-38f25cad-d8ab-4b8d-b8ea-c02aaec81908 req-c25f45fe-a990-42b6-86a9-c06e793343f2 service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Received unexpected event network-vif-plugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f for instance with vm_state building and task_state spawning. Apr 21 14:04:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.compute.manager [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:04:42 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] VM Resumed (Lifecycle Event) Apr 21 14:04:42 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Snapshot image upload complete Apr 21 14:04:42 user nova-compute[71474]: INFO nova.compute.manager [None req-f1cfe12d-45a5-47b1-8688-6dae102a1305 tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Took 5.27 seconds to snapshot the instance on the hypervisor. 
Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.compute.manager [req-caabe8f0-cd3d-4ed6-8824-c71ace2c6c4a req-988ff5c3-354e-41cd-8c7a-385bd1331283 service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Received event network-vif-plugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-caabe8f0-cd3d-4ed6-8824-c71ace2c6c4a req-988ff5c3-354e-41cd-8c7a-385bd1331283 service nova] Acquiring lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-caabe8f0-cd3d-4ed6-8824-c71ace2c6c4a req-988ff5c3-354e-41cd-8c7a-385bd1331283 service nova] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-caabe8f0-cd3d-4ed6-8824-c71ace2c6c4a req-988ff5c3-354e-41cd-8c7a-385bd1331283 service nova] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.compute.manager [req-caabe8f0-cd3d-4ed6-8824-c71ace2c6c4a req-988ff5c3-354e-41cd-8c7a-385bd1331283 service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] No waiting events found dispatching network-vif-plugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:04:42 user nova-compute[71474]: WARNING nova.compute.manager [req-caabe8f0-cd3d-4ed6-8824-c71ace2c6c4a req-988ff5c3-354e-41cd-8c7a-385bd1331283 service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Received unexpected event network-vif-plugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f for instance with vm_state building and task_state spawning. Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:04:42 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Instance spawned successfully. 
Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:42 user nova-compute[71474]: INFO nova.compute.manager [None 
req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:04:42 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] VM Started (Lifecycle Event) Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:04:42 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:04:42 user nova-compute[71474]: INFO nova.compute.manager [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Took 7.31 seconds to spawn the instance on the hypervisor. Apr 21 14:04:42 user nova-compute[71474]: DEBUG nova.compute.manager [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:04:42 user nova-compute[71474]: INFO nova.compute.manager [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Took 8.05 seconds to build instance. 
Apr 21 14:04:42 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-abd604e0-0326-4fc3-96ca-83b7773da2f3 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.151s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:43 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:43 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Triggering sync for uuid 30068c4a-94ed-4b84-9178-0d554326fc68 {{(pid=71474) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Triggering sync for uuid 5e502c4c-a46b-4670-acba-2fda2d05adf5 {{(pid=71474) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Triggering sync for uuid 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 {{(pid=71474) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Triggering sync for uuid a205a2a4-c0de-4c5c-abc4-7b034070e014 {{(pid=71474) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Triggering sync for uuid 80eb182f-948b-42d3-999b-339c5d615a73 {{(pid=71474) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Triggering sync for uuid aac5a363-5528-4d5f-8c90-6f9ad69a06dd {{(pid=71474) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Triggering sync for uuid 5cf0c20f-ffda-4578-adae-9aaef4c4bd18 {{(pid=71474) 
_sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "30068c4a-94ed-4b84-9178-0d554326fc68" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "30068c4a-94ed-4b84-9178-0d554326fc68" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "5e502c4c-a46b-4670-acba-2fda2d05adf5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "a205a2a4-c0de-4c5c-abc4-7b034070e014" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "80eb182f-948b-42d3-999b-339c5d615a73" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.055s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "30068c4a-94ed-4b84-9178-0d554326fc68" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.067s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "80eb182f-948b-42d3-999b-339c5d615a73" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.064s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 
0.074s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.083s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.090s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.095s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Acquiring lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Starting instance... 
{{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:48 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:04:48 user nova-compute[71474]: INFO nova.compute.claims [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Claim successful on node user Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.339s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Start building networks asynchronously for instance. 
{{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:04:49 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.policy [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95f0f10528294c9bb3d4f58f3361c358', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '41c39fcb224f4e69a73734be43ba6588', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Start spawning the instance on the hypervisor. 
{{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:04:49 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Creating image(s) Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Acquiring lock "/opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "/opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "/opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.129s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 
tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk 1073741824" returned: 0 in 0.043s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.186s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.126s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Checking if we can resize image /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Cannot resize image /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk to a smaller size. 
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:04:49 user nova-compute[71474]: DEBUG nova.objects.instance [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lazy-loading 'migration_context' on Instance uuid b5e2e065-1b7d-4cbf-b31a-923ae2f92fff {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Ensure instance console log exists: /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Successfully created port: 7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Successfully updated port: 7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 
tempest-ServerActionsTestJSON-2051074452-project-member] Acquiring lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Acquired lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG nova.compute.manager [req-d27eba96-d321-41dd-848f-4c108fdbce82 req-f4ff79e9-15bd-4d27-9442-9aa829fa89d4 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Received event network-changed-7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG nova.compute.manager [req-d27eba96-d321-41dd-848f-4c108fdbce82 req-f4ff79e9-15bd-4d27-9442-9aa829fa89d4 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Refreshing instance network info cache due to event network-changed-7eb11528-a882-4084-a2c7-b36fd432fecf. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-d27eba96-d321-41dd-848f-4c108fdbce82 req-f4ff79e9-15bd-4d27-9442-9aa829fa89d4 service nova] Acquiring lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:04:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updating instance_info_cache with network_info: [{"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Releasing lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Instance network_info: |[{"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-d27eba96-d321-41dd-848f-4c108fdbce82 
req-f4ff79e9-15bd-4d27-9442-9aa829fa89d4 service nova] Acquired lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.network.neutron [req-d27eba96-d321-41dd-848f-4c108fdbce82 req-f4ff79e9-15bd-4d27-9442-9aa829fa89d4 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Refreshing network info cache for port 7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Start _get_guest_xml network_info=[{"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:04:51 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
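The qemu-img invocations in the entries above are ordinary subprocesses and can be replayed by hand when debugging image preparation: nova probes the cached base image under a prlimit wrapper (address-space and CPU caps, --force-share so a file another process has open can still be read) and then layers a qcow2 overlay for the instance on top of it. A minimal sketch in Python, assuming the exact paths and the 1 GiB size reported by this log (this is not nova's imagebackend code):

    import json
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7"
    DISK = "/opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk"
    SIZE = 1073741824  # root_gb=1 from the m1.nano flavor shown later in this log

    def qemu_img_info(path):
        # Same invocation as the log: prlimit caps the child's address space and
        # CPU time; --force-share allows reading an image that is in use.
        out = subprocess.run(
            ["/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
             "--as=1073741824", "--cpu=30", "--",
             "env", "LC_ALL=C", "LANG=C",
             "qemu-img", "info", path, "--force-share", "--output=json"],
            check=True, capture_output=True, text=True).stdout
        return json.loads(out)

    print(qemu_img_info(BASE)["format"])   # the cached base is raw (see backing_fmt=raw)

    # Thin qcow2 overlay on the shared base, exactly as logged.
    subprocess.run(
        ["env", "LC_ALL=C", "LANG=C", "qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={BASE},backing_fmt=raw", DISK, str(SIZE)],
        check=True)

The later "Cannot resize image ... to a smaller size" entry is expected here: the overlay is created directly at the requested 1073741824 bytes, so there is nothing left to grow.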
Apr 21 14:04:51 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None 
req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:04:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-564594493',display_name='tempest-ServerActionsTestJSON-server-564594493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-564594493',id=20,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOBtS2vU2hjTdBp9+5GXmMtOFB68EC/JBp4srwATzJ0qLZ+BQocGkSEAP1z1S9M9P1kEv0Vd6quAa1O8JdG4KfvkaPJsmlaSpX/6CyeVnURB0GwGeWV66UBeLbeyknARPA==',key_name='tempest-keypair-2085847639',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='41c39fcb224f4e69a73734be43ba6588',ramdisk_id='',reservation_id='r-62aqkdvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-2051074452',owner_user_name='tempest-ServerActionsTestJSON-2051074452-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:04:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='95f0f10528294c9bb3d4f58f3361c358',uuid=b5e2e065-1b7d-4cbf-b31a-923ae2f92fff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Converting VIF {"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:9e:ea,bridge_name='br-int',has_traffic_filtering=True,id=7eb11528-a882-4084-a2c7-b36fd432fecf,network=Network(d9138a89-3d80-4ef8-b937-1613f614c9e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eb11528-a8') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.objects.instance [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lazy-loading 'pci_devices' on Instance uuid b5e2e065-1b7d-4cbf-b31a-923ae2f92fff {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] End _get_guest_xml xml= Apr 21 14:04:51 user nova-compute[71474]: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff Apr 21 14:04:51 user nova-compute[71474]: instance-00000014 Apr 21 14:04:51 user nova-compute[71474]: 131072 Apr 21 14:04:51 user nova-compute[71474]: 1 Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: tempest-ServerActionsTestJSON-server-564594493 Apr 21 14:04:51 user nova-compute[71474]: 2023-04-21 14:04:51 Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: 128 Apr 21 14:04:51 user nova-compute[71474]: 1 Apr 21 14:04:51 user nova-compute[71474]: 0 Apr 21 14:04:51 user nova-compute[71474]: 0 Apr 21 14:04:51 user nova-compute[71474]: 1 Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: tempest-ServerActionsTestJSON-2051074452-project-member Apr 21 14:04:51 user nova-compute[71474]: tempest-ServerActionsTestJSON-2051074452 Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: OpenStack Foundation Apr 21 14:04:51 user nova-compute[71474]: OpenStack Nova Apr 21 14:04:51 user nova-compute[71474]: 0.0.0 Apr 21 14:04:51 user nova-compute[71474]: 
b5e2e065-1b7d-4cbf-b31a-923ae2f92fff Apr 21 14:04:51 user nova-compute[71474]: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff Apr 21 14:04:51 user nova-compute[71474]: Virtual Machine Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: hvm Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Nehalem Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: /dev/urandom Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: Apr 21 14:04:51 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:04:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-564594493',display_name='tempest-ServerActionsTestJSON-server-564594493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-564594493',id=20,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOBtS2vU2hjTdBp9+5GXmMtOFB68EC/JBp4srwATzJ0qLZ+BQocGkSEAP1z1S9M9P1kEv0Vd6quAa1O8JdG4KfvkaPJsmlaSpX/6CyeVnURB0GwGeWV66UBeLbeyknARPA==',key_name='tempest-keypair-2085847639',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='41c39fcb224f4e69a73734be43ba6588',ramdisk_id='',reservation_id='r-62aqkdvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-2051074452',owner_user_name='tempest-ServerActionsTestJSON-2051074452-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:04:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='95f0f10528294c9bb3d4f58f3361c358',uuid=b5e2e065-1b7d-4cbf-b31a-923ae2f92fff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Converting VIF {"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:9e:ea,bridge_name='br-int',has_traffic_filtering=True,id=7eb11528-a882-4084-a2c7-b36fd432fecf,network=Network(d9138a89-3d80-4ef8-b937-1613f614c9e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eb11528-a8') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG os_vif [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:9e:ea,bridge_name='br-int',has_traffic_filtering=True,id=7eb11528-a882-4084-a2c7-b36fd432fecf,network=Network(d9138a89-3d80-4ef8-b937-1613f614c9e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eb11528-a8') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7eb11528-a8, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7eb11528-a8, col_values=(('external_ids', {'iface-id': '7eb11528-a882-4084-a2c7-b36fd432fecf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:9e:ea', 'vm-uuid': 'b5e2e065-1b7d-4cbf-b31a-923ae2f92fff'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:51 user nova-compute[71474]: INFO os_vif [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:9e:ea,bridge_name='br-int',has_traffic_filtering=True,id=7eb11528-a882-4084-a2c7-b36fd432fecf,network=Network(d9138a89-3d80-4ef8-b937-1613f614c9e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eb11528-a8') Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] No VIF found with MAC fa:16:3e:05:9e:ea, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.network.neutron [req-d27eba96-d321-41dd-848f-4c108fdbce82 req-f4ff79e9-15bd-4d27-9442-9aa829fa89d4 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updated VIF entry in instance network info cache for port 7eb11528-a882-4084-a2c7-b36fd432fecf. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG nova.network.neutron [req-d27eba96-d321-41dd-848f-4c108fdbce82 req-f4ff79e9-15bd-4d27-9442-9aa829fa89d4 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updating instance_info_cache with network_info: [{"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:04:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-d27eba96-d321-41dd-848f-4c108fdbce82 req-f4ff79e9-15bd-4d27-9442-9aa829fa89d4 service nova] Releasing lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:04:52 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:52 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:52 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:53 user nova-compute[71474]: DEBUG nova.compute.manager [req-45345ea1-cb06-435c-8f7e-49c0560e14da req-af16ce61-fabb-4827-8925-fc8899feb0af service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Received event network-vif-plugged-7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:04:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-45345ea1-cb06-435c-8f7e-49c0560e14da req-af16ce61-fabb-4827-8925-fc8899feb0af service nova] Acquiring lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-45345ea1-cb06-435c-8f7e-49c0560e14da req-af16ce61-fabb-4827-8925-fc8899feb0af service nova] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:53 user 
nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-45345ea1-cb06-435c-8f7e-49c0560e14da req-af16ce61-fabb-4827-8925-fc8899feb0af service nova] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:53 user nova-compute[71474]: DEBUG nova.compute.manager [req-45345ea1-cb06-435c-8f7e-49c0560e14da req-af16ce61-fabb-4827-8925-fc8899feb0af service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] No waiting events found dispatching network-vif-plugged-7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:04:53 user nova-compute[71474]: WARNING nova.compute.manager [req-45345ea1-cb06-435c-8f7e-49c0560e14da req-af16ce61-fabb-4827-8925-fc8899feb0af service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Received unexpected event network-vif-plugged-7eb11528-a882-4084-a2c7-b36fd432fecf for instance with vm_state building and task_state spawning. Apr 21 14:04:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:53 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:04:53 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:04:54 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] VM Resumed (Lifecycle Event) Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: 
b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:04:54 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Instance spawned successfully. Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:04:54 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:04:54 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] VM Started (Lifecycle Event) Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:04:54 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:04:54 user nova-compute[71474]: INFO nova.compute.manager [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Took 5.43 seconds to spawn the instance on the hypervisor. Apr 21 14:04:54 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:04:54 user nova-compute[71474]: INFO nova.compute.manager [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Took 6.08 seconds to build instance. 
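The "Plugging vif" sequence earlier in this span boils down to the three OVSDB transactions logged as AddBridgeCommand, AddPortCommand and DbSetCommand: make sure br-int exists, add the tap device as a port, and stamp the interface's external_ids with the Neutron port UUID (iface-id), the instance MAC and the instance UUID; ovn-controller matches that iface-id against the logical switch port to complete the binding, after which Neutron reports the port active. A rough ovs-vsctl equivalent, using the names from this log, is handy for inspecting or reproducing the plug by hand (the service itself talks OVSDB through ovsdbapp, not ovs-vsctl):

    import subprocess

    PORT = "tap7eb11528-a8"
    IFACE_ID = "7eb11528-a882-4084-a2c7-b36fd432fecf"   # Neutron port UUID
    MAC = "fa:16:3e:05:9e:ea"
    VM_UUID = "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff"

    # AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
    subprocess.run(["ovs-vsctl", "--may-exist", "add-br", "br-int",
                    "--", "set", "Bridge", "br-int", "datapath_type=system"],
                   check=True)
    # AddPortCommand plus the DbSetCommand on the Interface row
    subprocess.run(["ovs-vsctl", "--may-exist", "add-port", "br-int", PORT,
                    "--", "set", "Interface", PORT,
                    f"external_ids:iface-id={IFACE_ID}",
                    "external_ids:iface-status=active",
                    f"external_ids:attached-mac={MAC}",
                    f"external_ids:vm-uuid={VM_UUID}"],
                   check=True)
    # Inspect what was written to confirm the binding data is in place.
    print(subprocess.run(["ovs-vsctl", "get", "Interface", PORT, "external_ids"],
                         capture_output=True, text=True, check=True).stdout)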
Apr 21 14:04:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9d29eff9-da55-47c0-932c-3e451f642049 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.167s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:55 user nova-compute[71474]: DEBUG nova.compute.manager [req-d3f097c2-2216-4dae-baef-054628994fdf req-f37cf140-1803-4b13-87ac-a5d0a39624e8 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Received event network-vif-plugged-7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:04:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-d3f097c2-2216-4dae-baef-054628994fdf req-f37cf140-1803-4b13-87ac-a5d0a39624e8 service nova] Acquiring lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-d3f097c2-2216-4dae-baef-054628994fdf req-f37cf140-1803-4b13-87ac-a5d0a39624e8 service nova] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-d3f097c2-2216-4dae-baef-054628994fdf req-f37cf140-1803-4b13-87ac-a5d0a39624e8 service nova] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:55 user nova-compute[71474]: DEBUG nova.compute.manager [req-d3f097c2-2216-4dae-baef-054628994fdf req-f37cf140-1803-4b13-87ac-a5d0a39624e8 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] No waiting events found dispatching network-vif-plugged-7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:04:55 user nova-compute[71474]: WARNING nova.compute.manager [req-d3f097c2-2216-4dae-baef-054628994fdf req-f37cf140-1803-4b13-87ac-a5d0a39624e8 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Received unexpected event network-vif-plugged-7eb11528-a882-4084-a2c7-b36fd432fecf for instance with vm_state active and task_state None. 
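The two WARNING entries about network-vif-plugged-7eb11528-a882-4084-a2c7-b36fd432fecf are benign. Nova registers the external events it intends to wait for, and when Neutron's notification arrives the handler pops the matching waiter and signals it; in this run the spawn path was not (or was no longer) waiting on that event, so each notification found no registered waiter and was logged as unexpected and dropped, with no effect on the instance. In outline the mechanism looks like this (an illustrative sketch only, not nova's actual InstanceEvents code):

    import threading

    class InstanceEvents:
        # Map (instance uuid, event name) -> waiter, guarded by a lock,
        # mirroring the "-events" lock acquire/release entries above.
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = {}

        def prepare(self, uuid, name):
            ev = threading.Event()
            with self._lock:
                self._waiters[(uuid, name)] = ev
            return ev                     # the spawn thread blocks on ev.wait()

        def pop(self, uuid, name):
            with self._lock:
                return self._waiters.pop((uuid, name), None)

    events = InstanceEvents()
    uuid = "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff"
    name = "network-vif-plugged-7eb11528-a882-4084-a2c7-b36fd432fecf"

    # A Neutron notification arrives after spawn has stopped waiting:
    waiter = events.pop(uuid, name)
    if waiter is not None:
        waiter.set()                      # wakes the thread in wait_for_instance_event
    else:
        print(f"Received unexpected event {name}")   # the WARNING seen above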
Apr 21 14:04:55 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:04:55 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:04:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:04:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:04:55 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:04:55 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Updating instance_info_cache with network_info: [{"id": "5c5a52bd-c710-4614-8353-55be240cfa17", "address": "fa:16:3e:66:41:1d", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c5a52bd-c7", "ovs_interfaceid": "5c5a52bd-c710-4614-8353-55be240cfa17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG nova.compute.manager [None 
req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json {{(pid=71474) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json" returned: 0 in 0.195s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk --force-share --output=json" returned: 0 in 0.217s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.133s 
{{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None 
None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.184s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:04:59 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 14:04:59 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8269MB free_disk=25.95494842529297GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, 
"label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": 
"0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:04:59 user nova-compute[71474]: INFO nova.compute.manager [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Rescuing Apr 21 14:04:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "refresh_cache-80eb182f-948b-42d3-999b-339c5d615a73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquired lock "refresh_cache-80eb182f-948b-42d3-999b-339c5d615a73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.network.neutron [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 30068c4a-94ed-4b84-9178-0d554326fc68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 
'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5e502c4c-a46b-4670-acba-2fda2d05adf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance a205a2a4-c0de-4c5c-abc4-7b034070e014 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 80eb182f-948b-42d3-999b-339c5d615a73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance aac5a363-5528-4d5f-8c90-6f9ad69a06dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5cf0c20f-ffda-4578-adae-9aaef4c4bd18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance b5e2e065-1b7d-4cbf-b31a-923ae2f92fff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=8GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.438s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG nova.network.neutron [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Updating instance_info_cache with network_info: [{"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:04:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Releasing lock "refresh_cache-80eb182f-948b-42d3-999b-339c5d615a73" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.compute.manager [req-23382440-a121-4ec8-9e86-41d622b31ec8 req-07e72f7b-fb35-4c83-bdce-b94f3ac96517 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received event network-vif-unplugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-23382440-a121-4ec8-9e86-41d622b31ec8 req-07e72f7b-fb35-4c83-bdce-b94f3ac96517 service nova] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-23382440-a121-4ec8-9e86-41d622b31ec8 req-07e72f7b-fb35-4c83-bdce-b94f3ac96517 service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-23382440-a121-4ec8-9e86-41d622b31ec8 req-07e72f7b-fb35-4c83-bdce-b94f3ac96517 service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.compute.manager [req-23382440-a121-4ec8-9e86-41d622b31ec8 req-07e72f7b-fb35-4c83-bdce-b94f3ac96517 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] No waiting events found dispatching network-vif-unplugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:05:00 user nova-compute[71474]: WARNING nova.compute.manager [req-23382440-a121-4ec8-9e86-41d622b31ec8 req-07e72f7b-fb35-4c83-bdce-b94f3ac96517 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received unexpected event network-vif-unplugged-def6080a-bf3f-4516-8140-08f463f69eb7 for instance with vm_state active and task_state rescuing. 
Apr 21 14:05:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:00 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Instance destroyed successfully. Apr 21 14:05:00 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Attempting rescue Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} {{(pid=71474) rescue /opt/stack/nova/nova/virt/libvirt/driver.py:4289}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Instance directory exists: not creating {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4694}} Apr 21 14:05:00 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Creating image(s) Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "/opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "/opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "/opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.info" "released" by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.objects.instance [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'trusted_certs' on Instance uuid 80eb182f-948b-42d3-999b-339c5d615a73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.130s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.rescue {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f 
qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.rescue" returned: 0 in 0.058s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.194s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.objects.instance [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'migration_context' on Instance uuid 80eb182f-948b-42d3-999b-339c5d615a73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Start _get_guest_xml network_info=[{"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "vif_mac": "fa:16:3e:ff:23:c8"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue={'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.objects.instance [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'resources' on Instance uuid 80eb182f-948b-42d3-999b-339c5d615a73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.objects.instance [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'numa_topology' on Instance uuid 80eb182f-948b-42d3-999b-339c5d615a73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:05:00 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:05:00 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.objects.instance [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'vcpu_model' on Instance uuid 80eb182f-948b-42d3-999b-339c5d615a73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:03:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-718848885',display_name='tempest-ServerRescueNegativeTestJSON-server-718848885',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-718848885',id=17,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T14:03:16Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='432a123307454a44922597d6c9089447',ramdisk_id='',reservation_id='r-oiv28z2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-193683719',owner_user_name='tempest-ServerRescueNegativeTestJSON-193683719-project-member'},tags=,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:03:17Z,user_data=None,user_id='4df58f0cb48f4aa29df57f9c2f632782',uuid=80eb182f-948b-42d3-999b-339c5d615a73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "vif_mac": "fa:16:3e:ff:23:c8"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None 
req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converting VIF {"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "vif_mac": "fa:16:3e:ff:23:c8"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:23:c8,bridge_name='br-int',has_traffic_filtering=True,id=def6080a-bf3f-4516-8140-08f463f69eb7,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef6080a-bf') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.objects.instance [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'pci_devices' on Instance uuid 80eb182f-948b-42d3-999b-339c5d615a73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:00 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] End _get_guest_xml xml= [guest domain XML garbled in extraction; recoverable values from the fragments: uuid 80eb182f-948b-42d3-999b-339c5d615a73, name instance-00000011, memory 131072, 1 vCPU, title tempest-ServerRescueNegativeTestJSON-server-718848885, creation time 2023-04-21 14:05:00, owner tempest-ServerRescueNegativeTestJSON-193683719-project-member / tempest-ServerRescueNegativeTestJSON-193683719, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:05:01 user nova-compute[71474]: INFO
nova.virt.libvirt.driver [-] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Instance destroyed successfully. Apr 21 14:05:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:05:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] No BDM found with device name vdb, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:05:01 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] No VIF found with MAC fa:16:3e:ff:23:c8, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:05:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:05:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:05:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:05:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:05:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:02 user nova-compute[71474]: DEBUG nova.compute.manager [req-2cddcfe4-dfca-47da-bd9e-d4550c2371dc req-b873639f-5c3d-45b2-9bb6-9d466ac3ba7d service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received event network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:05:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-2cddcfe4-dfca-47da-bd9e-d4550c2371dc req-b873639f-5c3d-45b2-9bb6-9d466ac3ba7d service nova] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-2cddcfe4-dfca-47da-bd9e-d4550c2371dc req-b873639f-5c3d-45b2-9bb6-9d466ac3ba7d service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-2cddcfe4-dfca-47da-bd9e-d4550c2371dc req-b873639f-5c3d-45b2-9bb6-9d466ac3ba7d service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:02 user nova-compute[71474]: DEBUG nova.compute.manager [req-2cddcfe4-dfca-47da-bd9e-d4550c2371dc req-b873639f-5c3d-45b2-9bb6-9d466ac3ba7d service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] No waiting events found dispatching network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:05:02 user nova-compute[71474]: WARNING nova.compute.manager [req-2cddcfe4-dfca-47da-bd9e-d4550c2371dc req-b873639f-5c3d-45b2-9bb6-9d466ac3ba7d service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received unexpected event network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 for instance with vm_state active and task_state rescuing. 
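The lock/pop/warn sequence above is nova's external-event dispatch pattern: the per-instance "<uuid>-events" lock guards a table of waiters, pop_instance_event removes the matching waiter, and when nothing is waiting the event is reported as unexpected (here because the rescue path had not registered a waiter for network-vif-plugged). A rough sketch of the idea with assumed names, not the actual ComputeManager code:

    import threading

    class InstanceEvents:
        # Assumed, simplified stand-in for the waiter table behind the
        # "<instance-uuid>-events" lock seen in the log.
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = {}   # (instance_uuid, event_name) -> threading.Event

        def prepare_for_event(self, instance_uuid, event_name):
            with self._lock:
                waiter = threading.Event()
                self._waiters[(instance_uuid, event_name)] = waiter
                return waiter

        def pop_instance_event(self, instance_uuid, event_name):
            with self._lock:
                return self._waiters.pop((instance_uuid, event_name), None)

    def external_instance_event(events, instance_uuid, event_name):
        waiter = events.pop_instance_event(instance_uuid, event_name)
        if waiter is None:
            print("WARNING: received unexpected event %s for %s"
                  % (event_name, instance_uuid))
        else:
            waiter.set()   # wake the thread waiting for e.g. network-vif-plugged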
Apr 21 14:05:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.compute.manager [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received event network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.compute.manager [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] No waiting events found dispatching network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:05:04 user nova-compute[71474]: WARNING nova.compute.manager [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received unexpected event network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 for instance with vm_state active and task_state rescuing. 
Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.compute.manager [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received event network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.compute.manager [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] No waiting events found dispatching network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:05:04 user nova-compute[71474]: WARNING nova.compute.manager [req-11db2380-857a-4eab-887d-b70a4140cde5 req-4b2b91bb-bedd-4fbd-8090-1a9455a9f565 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received unexpected event network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 for instance with vm_state active and task_state rescuing. 
Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.virt.libvirt.host [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Removed pending event for 80eb182f-948b-42d3-999b-339c5d615a73 due to event {{(pid=71474) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:05:04 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] VM Resumed (Lifecycle Event) Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-5248437d-164f-4a51-9f0c-fdb8e9c2a7b7 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:05:04 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] During sync_power_state the instance has a pending task (rescuing). Skip. 
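The "pending task ... Skip" line reflects how lifecycle events are reconciled with the database: when the instance still has a task in flight (task_state "rescuing" here), power-state synchronization is deferred rather than risk racing the ongoing operation. A compact, assumed sketch of that decision, not nova's actual _sync_instance_power_state logic:

    RUNNING = 1   # assumed power-state constant for illustration

    def sync_power_state(task_state, db_power_state, vm_power_state):
        if task_state is not None:
            return "skip: instance has a pending task (%s)" % task_state
        if db_power_state != vm_power_state:
            return "update DB power_state %s -> %s" % (db_power_state, vm_power_state)
        return "in sync"

    print(sync_power_state("rescuing", RUNNING, RUNNING))
    # -> skip: instance has a pending task (rescuing)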
Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:05:04 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] VM Started (Lifecycle Event) Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:05:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:05:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:06 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:16 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:05:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71474) run 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:05:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:05:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:05:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "4e4878d9-5766-433a-9f81-4bb92c369e71" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "4e4878d9-5766-433a-9f81-4bb92c369e71" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:44 user nova-compute[71474]: DEBUG nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Starting instance... 
{{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:05:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:44 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:05:44 user nova-compute[71474]: INFO nova.compute.claims [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Claim successful on node user Apr 21 14:05:45 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:05:45 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:05:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.448s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:45 user nova-compute[71474]: DEBUG nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 
tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:05:45 user nova-compute[71474]: DEBUG nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:05:45 user nova-compute[71474]: DEBUG nova.network.neutron [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:05:45 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:05:45 user nova-compute[71474]: DEBUG nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Start building block device mappings for instance. 
{{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:05:45 user nova-compute[71474]: INFO nova.virt.block_device [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Booting with volume-backed-image 2edfef44-2867-4e03-a53e-b139f99afa75 at /dev/vda Apr 21 14:05:45 user nova-compute[71474]: DEBUG nova.policy [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '132913991f8c45c1adaf5db7ef7cea30', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '885cdc1521a14985bfa70ae21e73c693', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:05:46 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:46 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:46 user nova-compute[71474]: DEBUG nova.network.neutron [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Successfully created port: a6133e22-45a6-4a63-8885-91a7b439d45b {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:05:46 user nova-compute[71474]: DEBUG nova.network.neutron [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Successfully updated port: a6133e22-45a6-4a63-8885-91a7b439d45b {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:05:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "refresh_cache-4e4878d9-5766-433a-9f81-4bb92c369e71" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:05:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquired lock "refresh_cache-4e4878d9-5766-433a-9f81-4bb92c369e71" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:05:46 user nova-compute[71474]: DEBUG nova.network.neutron [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 
4e4878d9-5766-433a-9f81-4bb92c369e71] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:05:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-8c82eaee-cba8-48dd-8508-1d413e78a937 req-0dc0a38a-8277-48c5-b261-493c0f9345cd service nova] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Received event network-changed-a6133e22-45a6-4a63-8885-91a7b439d45b {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:05:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-8c82eaee-cba8-48dd-8508-1d413e78a937 req-0dc0a38a-8277-48c5-b261-493c0f9345cd service nova] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Refreshing instance network info cache due to event network-changed-a6133e22-45a6-4a63-8885-91a7b439d45b. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:05:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8c82eaee-cba8-48dd-8508-1d413e78a937 req-0dc0a38a-8277-48c5-b261-493c0f9345cd service nova] Acquiring lock "refresh_cache-4e4878d9-5766-433a-9f81-4bb92c369e71" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:05:46 user nova-compute[71474]: DEBUG nova.network.neutron [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Instance cache missing network info. {{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:05:47 user nova-compute[71474]: DEBUG nova.network.neutron [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Updating instance_info_cache with network_info: [{"id": "a6133e22-45a6-4a63-8885-91a7b439d45b", "address": "fa:16:3e:7d:be:aa", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6133e22-45", "ovs_interfaceid": "a6133e22-45a6-4a63-8885-91a7b439d45b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:05:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Releasing lock "refresh_cache-4e4878d9-5766-433a-9f81-4bb92c369e71" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 
14:05:47 user nova-compute[71474]: DEBUG nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Instance network_info: |[{"id": "a6133e22-45a6-4a63-8885-91a7b439d45b", "address": "fa:16:3e:7d:be:aa", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6133e22-45", "ovs_interfaceid": "a6133e22-45a6-4a63-8885-91a7b439d45b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:05:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8c82eaee-cba8-48dd-8508-1d413e78a937 req-0dc0a38a-8277-48c5-b261-493c0f9345cd service nova] Acquired lock "refresh_cache-4e4878d9-5766-433a-9f81-4bb92c369e71" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:05:47 user nova-compute[71474]: DEBUG nova.network.neutron [req-8c82eaee-cba8-48dd-8508-1d413e78a937 req-0dc0a38a-8277-48c5-b261-493c0f9345cd service nova] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Refreshing network info cache for port a6133e22-45a6-4a63-8885-91a7b439d45b {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:05:47 user nova-compute[71474]: DEBUG nova.network.neutron [req-8c82eaee-cba8-48dd-8508-1d413e78a937 req-0dc0a38a-8277-48c5-b261-493c0f9345cd service nova] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Updated VIF entry in instance network info cache for port a6133e22-45a6-4a63-8885-91a7b439d45b. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:05:47 user nova-compute[71474]: DEBUG nova.network.neutron [req-8c82eaee-cba8-48dd-8508-1d413e78a937 req-0dc0a38a-8277-48c5-b261-493c0f9345cd service nova] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Updating instance_info_cache with network_info: [{"id": "a6133e22-45a6-4a63-8885-91a7b439d45b", "address": "fa:16:3e:7d:be:aa", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6133e22-45", "ovs_interfaceid": "a6133e22-45a6-4a63-8885-91a7b439d45b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:05:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8c82eaee-cba8-48dd-8508-1d413e78a937 req-0dc0a38a-8277-48c5-b261-493c0f9345cd service nova] Releasing lock "refresh_cache-4e4878d9-5766-433a-9f81-4bb92c369e71" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:05:49 user nova-compute[71474]: DEBUG nova.compute.manager [req-7cd6b8b2-a9df-4ebc-9d46-3b988c3d640f req-9da32b3e-8d5a-4ffe-9ad9-176bcb88efd9 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Received event network-changed-451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:05:49 user nova-compute[71474]: DEBUG nova.compute.manager [req-7cd6b8b2-a9df-4ebc-9d46-3b988c3d640f req-9da32b3e-8d5a-4ffe-9ad9-176bcb88efd9 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Refreshing instance network info cache due to event network-changed-451630a7-eec6-4614-8737-498026e8d671. 
{{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:05:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7cd6b8b2-a9df-4ebc-9d46-3b988c3d640f req-9da32b3e-8d5a-4ffe-9ad9-176bcb88efd9 service nova] Acquiring lock "refresh_cache-aac5a363-5528-4d5f-8c90-6f9ad69a06dd" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:05:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7cd6b8b2-a9df-4ebc-9d46-3b988c3d640f req-9da32b3e-8d5a-4ffe-9ad9-176bcb88efd9 service nova] Acquired lock "refresh_cache-aac5a363-5528-4d5f-8c90-6f9ad69a06dd" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:05:49 user nova-compute[71474]: DEBUG nova.network.neutron [req-7cd6b8b2-a9df-4ebc-9d46-3b988c3d640f req-9da32b3e-8d5a-4ffe-9ad9-176bcb88efd9 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Refreshing network info cache for port 451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:05:50 user nova-compute[71474]: DEBUG nova.network.neutron [req-7cd6b8b2-a9df-4ebc-9d46-3b988c3d640f req-9da32b3e-8d5a-4ffe-9ad9-176bcb88efd9 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Updated VIF entry in instance network info cache for port 451630a7-eec6-4614-8737-498026e8d671. {{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:05:50 user nova-compute[71474]: DEBUG nova.network.neutron [req-7cd6b8b2-a9df-4ebc-9d46-3b988c3d640f req-9da32b3e-8d5a-4ffe-9ad9-176bcb88efd9 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Updating instance_info_cache with network_info: [{"id": "451630a7-eec6-4614-8737-498026e8d671", "address": "fa:16:3e:16:01:e5", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.121", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap451630a7-ee", "ovs_interfaceid": "451630a7-eec6-4614-8737-498026e8d671", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:05:50 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7cd6b8b2-a9df-4ebc-9d46-3b988c3d640f req-9da32b3e-8d5a-4ffe-9ad9-176bcb88efd9 service nova] Releasing lock "refresh_cache-aac5a363-5528-4d5f-8c90-6f9ad69a06dd" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:05:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.006s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:51 user nova-compute[71474]: INFO nova.compute.manager [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Terminating instance Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:05:51 user nova-compute[71474]: WARNING nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Volume id: a4c5d5fc-5b50-49d2-8708-6dc4294bc556 finished being created but its status is error. 
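The warning above, and the VolumeNotCreated traceback that follows, come from the poll loop nova runs while the volume service builds the boot volume: it keeps checking the volume status until it becomes available, and gives up when the status turns to error or the attempt budget runs out. A hedged sketch of such a loop (the interval and attempt limit are assumptions, not nova's configuration):

    import time

    class VolumeNotCreated(Exception):
        pass

    def await_volume_created(get_status, vol_id, interval=1.0, max_attempts=60):
        # get_status(vol_id) would ask the volume service; here it is any callable.
        start = time.time()
        status = None
        for attempt in range(1, max_attempts + 1):
            status = get_status(vol_id)
            if status == "available":
                return attempt
            if status == "error":
                break
            time.sleep(interval)
        raise VolumeNotCreated(
            "Volume %s did not finish being created even after we waited %d "
            "seconds or %d attempts. And its status is %s."
            % (vol_id, int(time.time() - start), attempt, status))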
Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume a4c5d5fc-5b50-49d2-8708-6dc4294bc556 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Traceback (most recent call last): Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] driver_block_device.attach_block_devices( Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] _log_and_attach(device) Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] bdm.attach(*attach_args, **attach_kwargs) Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] File "/opt/stack/nova/nova/virt/block_device.py", line 831, in attach Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] self.volume_id, self.attachment_id = self._create_volume( Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] with excutils.save_and_reraise_exception(): Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] self.force_reraise() Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] raise self.value Apr 21 14:05:51 user 
nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] wait_func(context, volume_id) Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] nova.exception.VolumeNotCreated: Volume a4c5d5fc-5b50-49d2-8708-6dc4294bc556 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.claims [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Aborting claim: {{(pid=71474) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.404s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Build of instance 4e4878d9-5766-433a-9f81-4bb92c369e71 aborted: Volume a4c5d5fc-5b50-49d2-8708-6dc4294bc556 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.utils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Build of instance 4e4878d9-5766-433a-9f81-4bb92c369e71 aborted: Volume a4c5d5fc-5b50-49d2-8708-6dc4294bc556 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=71474) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 21 14:05:51 user nova-compute[71474]: ERROR nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Build of instance 4e4878d9-5766-433a-9f81-4bb92c369e71 aborted: Volume a4c5d5fc-5b50-49d2-8708-6dc4294bc556 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 4e4878d9-5766-433a-9f81-4bb92c369e71 aborted: Volume a4c5d5fc-5b50-49d2-8708-6dc4294bc556 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. 
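The traceback above shows nova-compute's bounded wait on Cinder: block-device setup polls the new volume's status and gives up once the configured budget is spent (the message reports "5 seconds or 2 attempts"), raising VolumeNotCreated, which then aborts the build and rolls back the resource claim. Below is a minimal sketch of such a bounded poll loop; the helper name, the status callable and the defaults are illustrative assumptions, not Nova's actual implementation.

import time


class VolumeNotCreated(Exception):
    """Raised when the volume never becomes available within the budget."""


def wait_for_volume_available(get_status, volume_id, max_attempts=2, interval=5.0):
    # Poll get_status(volume_id) until it reports 'available'; stop early on
    # 'error' and give up after max_attempts polls, mirroring the retry budget
    # quoted in the log message above.
    status = 'unknown'
    for attempt in range(1, max_attempts + 1):
        status = get_status(volume_id)
        if status == 'available':
            return
        if status == 'error':
            # No point waiting further; surface the failure immediately.
            break
        if attempt < max_attempts:
            time.sleep(interval)
    raise VolumeNotCreated(
        "Volume %s did not finish being created (last status: %s)"
        % (volume_id, status))

In the failure logged here the volume lands in the error state, so the wait stops early and the BuildAbortException path that follows aborts the claim, unplugs the VIF, deallocates the network and deletes the placement allocations for instance 4e4878d9-5766-433a-9f81-4bb92c369e71.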
Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Unplugging VIFs for instance {{(pid=71474) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:05:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1205999674',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1205999674',id=21,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='885cdc1521a14985bfa70ae21e73c693',ramdisk_id='',reservation_id='r-7ml09ung',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-28514522',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:05:45Z,user_data=None,user_id='132913991f8c45c1adaf5db7ef7cea30',uuid=4e4878d9-5766-433a-9f81-4bb92c369e71,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6133e22-45a6-4a63-8885-91a7b439d45b", "address": "fa:16:3e:7d:be:aa", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6133e22-45", 
"ovs_interfaceid": "a6133e22-45a6-4a63-8885-91a7b439d45b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converting VIF {"id": "a6133e22-45a6-4a63-8885-91a7b439d45b", "address": "fa:16:3e:7d:be:aa", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6133e22-45", "ovs_interfaceid": "a6133e22-45a6-4a63-8885-91a7b439d45b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:be:aa,bridge_name='br-int',has_traffic_filtering=True,id=a6133e22-45a6-4a63-8885-91a7b439d45b,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6133e22-45') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG os_vif [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:be:aa,bridge_name='br-int',has_traffic_filtering=True,id=a6133e22-45a6-4a63-8885-91a7b439d45b,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6133e22-45') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6133e22-45, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:05:51 user nova-compute[71474]: INFO os_vif [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:be:aa,bridge_name='br-int',has_traffic_filtering=True,id=a6133e22-45a6-4a63-8885-91a7b439d45b,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6133e22-45') Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Unplugged VIFs for instance {{(pid=71474) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.network.neutron [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.manager [req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Received event network-vif-unplugged-451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] Acquiring lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.manager 
[req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] No waiting events found dispatching network-vif-unplugged-451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.manager [req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Received event network-vif-unplugged-451630a7-eec6-4614-8737-498026e8d671 for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.manager [req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Received event network-vif-plugged-451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] Acquiring lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.manager [req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] No waiting events found dispatching network-vif-plugged-451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:05:51 user nova-compute[71474]: WARNING nova.compute.manager [req-dd88e5d9-0f83-4fa3-929a-c6cf1425dba8 req-de9bffd0-3dee-457f-ac80-516320179c65 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Received unexpected event network-vif-plugged-451630a7-eec6-4614-8737-498026e8d671 for instance with vm_state active and task_state deleting. Apr 21 14:05:51 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Instance destroyed successfully. 
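The network-vif-unplugged and network-vif-plugged entries above show the pop semantics of the per-instance event table: under the "...-events" lock an incoming external event is handed to a registered waiter if one exists; otherwise it is only logged ("No waiting events found"), and an event that makes no sense for the instance's current state (here a plugged event while task_state is deleting) is downgraded to a warning. The sketch below is an illustrative stand-in for that pattern, not Nova's InstanceEvents code.

import threading
from collections import defaultdict


class InstanceEventRegistry:
    def __init__(self):
        self._lock = threading.Lock()        # stands in for the per-instance "-events" lock
        self._waiters = defaultdict(dict)    # instance_uuid -> {event_name: threading.Event}

    def prepare(self, instance_uuid, event_name):
        """Register interest in an event and return something to wait on."""
        waiter = threading.Event()
        with self._lock:
            self._waiters[instance_uuid][event_name] = waiter
        return waiter

    def pop(self, instance_uuid, event_name):
        """Remove and return the waiter for this event, or None if nobody waits."""
        with self._lock:
            return self._waiters[instance_uuid].pop(event_name, None)

    def dispatch(self, instance_uuid, event_name):
        waiter = self.pop(instance_uuid, event_name)
        if waiter is None:
            # Corresponds to "No waiting events found dispatching <event>" above.
            print("no waiting events found dispatching %s" % event_name)
            return False
        waiter.set()
        return True

In this log the instance aac5a363-5528-4d5f-8c90-6f9ad69a06dd is already being torn down, so neither event finds a waiter and the plugged event is reported as unexpected rather than dispatched.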
Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.objects.instance [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lazy-loading 'resources' on Instance uuid aac5a363-5528-4d5f-8c90-6f9ad69a06dd {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:03:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2067580371',display_name='tempest-AttachVolumeNegativeTest-server-2067580371',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2067580371',id=18,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFlxdy5PoliBeRnt4tWYC7Fu7pkGOage/HG8uxLBIgt7DEFt1QHK8dwArdP14y447xyPamNSB8z6pOZhh9qz3WsCy+knrmjpHD2UmoJzo5C/B6YDf9z6px5GYzSzQaO+nA==',key_name='tempest-keypair-500746232',keypairs=,launch_index=0,launched_at=2023-04-21T14:04:04Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f0ccc2c950364fcbb0f2b1cc937f6a82',ramdisk_id='',reservation_id='r-uma60tll',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-166063504',owner_user_name='tempest-AttachVolumeNegativeTest-166063504-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:04:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ab1d2ed7df2f4a9bbf14da7e2c5fece2',uuid=aac5a363-5528-4d5f-8c90-6f9ad69a06dd,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "451630a7-eec6-4614-8737-498026e8d671", "address": "fa:16:3e:16:01:e5", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.121", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap451630a7-ee", "ovs_interfaceid": "451630a7-eec6-4614-8737-498026e8d671", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converting VIF {"id": "451630a7-eec6-4614-8737-498026e8d671", "address": "fa:16:3e:16:01:e5", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.121", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap451630a7-ee", "ovs_interfaceid": "451630a7-eec6-4614-8737-498026e8d671", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:16:01:e5,bridge_name='br-int',has_traffic_filtering=True,id=451630a7-eec6-4614-8737-498026e8d671,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap451630a7-ee') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG os_vif [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:01:e5,bridge_name='br-int',has_traffic_filtering=True,id=451630a7-eec6-4614-8737-498026e8d671,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap451630a7-ee') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap451630a7-ee, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:05:51 user nova-compute[71474]: INFO os_vif [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:01:e5,bridge_name='br-int',has_traffic_filtering=True,id=451630a7-eec6-4614-8737-498026e8d671,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap451630a7-ee') Apr 21 14:05:51 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Deleting instance files /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd_del Apr 21 14:05:51 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Deletion of /opt/stack/data/nova/instances/aac5a363-5528-4d5f-8c90-6f9ad69a06dd_del complete Apr 21 14:05:51 user nova-compute[71474]: INFO nova.compute.manager [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 21 14:05:51 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:05:51 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:05:52 user nova-compute[71474]: DEBUG nova.network.neutron [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:05:52 user nova-compute[71474]: INFO nova.compute.manager [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4e4878d9-5766-433a-9f81-4bb92c369e71] Took 0.54 seconds to deallocate network for instance. Apr 21 14:05:52 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Deleted allocations for instance 4e4878d9-5766-433a-9f81-4bb92c369e71 Apr 21 14:05:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-6d5776b1-8d73-4ca5-b971-6a445d7be8bf tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "4e4878d9-5766-433a-9f81-4bb92c369e71" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.383s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:52 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:05:52 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Took 0.71 seconds to deallocate network for instance. 
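The "Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7" entries in this section repeat the raw inventory nova-compute keeps in sync with placement. Assuming placement's usual capacity formula, (total - reserved) * allocation_ratio, the figures logged here work out to 48 schedulable VCPU, 15511 MB of RAM and 40 GB of disk; the short snippet below just redoes that arithmetic on the logged values.

# Quick check of the inventory logged above, assuming the capacity formula
# (total - reserved) * allocation_ratio used by placement.
inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print("%s: capacity %g" % (rc, capacity))
# VCPU: capacity 48, MEMORY_MB: capacity 15511, DISK_GB: capacity 40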
Apr 21 14:05:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:52 user nova-compute[71474]: DEBUG nova.compute.manager [req-1df6f219-d538-4fb1-a45e-97ff76a62524 req-41e964b1-2e8b-4d51-b1e9-d5e71c545c13 service nova] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Received event network-vif-deleted-451630a7-eec6-4614-8737-498026e8d671 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:05:52 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:05:52 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:05:52 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.287s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:53 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Deleted allocations for instance aac5a363-5528-4d5f-8c90-6f9ad69a06dd Apr 21 14:05:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-cad82dd2-f9d4-4b57-954c-34583b4b67c5 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "aac5a363-5528-4d5f-8c90-6f9ad69a06dd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.232s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:55 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:05:55 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:05:55 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:05:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:05:55 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:05:55 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:55 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None 
req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.rescue --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.rescue --force-share --output=json" returned: 0 in 0.133s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.rescue --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.rescue --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:56 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:57 user 
nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:05:58 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:05:58 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8324MB free_disk=25.93378448486328GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 30068c4a-94ed-4b84-9178-0d554326fc68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5e502c4c-a46b-4670-acba-2fda2d05adf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance a205a2a4-c0de-4c5c-abc4-7b034070e014 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 80eb182f-948b-42d3-999b-339c5d615a73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5cf0c20f-ffda-4578-adae-9aaef4c4bd18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance b5e2e065-1b7d-4cbf-b31a-923ae2f92fff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 7 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=1408MB phys_disk=40GB used_disk=7GB total_vcpus=12 used_vcpus=7 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:05:58 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:05:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None 
req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:05:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.348s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Rebuilding the list of instances to heal {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG nova.objects.instance [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lazy-loading 'info_cache' on Instance uuid 30068c4a-94ed-4b84-9178-0d554326fc68 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Updating instance_info_cache with network_info: [{"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-30068c4a-94ed-4b84-9178-0d554326fc68" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:06:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:06:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:06:02 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:06:03 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:06:06 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:06:06 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] VM Stopped (Lifecycle Event) Apr 21 14:06:06 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f773ffd-a3aa-45f7-a867-ead16afa5a5f None None] [instance: aac5a363-5528-4d5f-8c90-6f9ad69a06dd] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:06:06 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:06:16 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:16 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:06:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:06:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:06:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:06:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock 
"5cf0c20f-ffda-4578-adae-9aaef4c4bd18" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:27 user nova-compute[71474]: INFO nova.compute.manager [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Terminating instance Apr 21 14:06:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Start destroying the instance on the hypervisor. 
{{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG nova.compute.manager [req-be057b73-bddd-42a2-823e-91c6cdd1d52a req-bdc113ac-3028-42df-9445-f6f4f4e3ec2a service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Received event network-vif-unplugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-be057b73-bddd-42a2-823e-91c6cdd1d52a req-bdc113ac-3028-42df-9445-f6f4f4e3ec2a service nova] Acquiring lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-be057b73-bddd-42a2-823e-91c6cdd1d52a req-bdc113ac-3028-42df-9445-f6f4f4e3ec2a service nova] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-be057b73-bddd-42a2-823e-91c6cdd1d52a req-bdc113ac-3028-42df-9445-f6f4f4e3ec2a service nova] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG nova.compute.manager [req-be057b73-bddd-42a2-823e-91c6cdd1d52a req-bdc113ac-3028-42df-9445-f6f4f4e3ec2a service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] No waiting events found dispatching network-vif-unplugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG nova.compute.manager [req-be057b73-bddd-42a2-823e-91c6cdd1d52a req-bdc113ac-3028-42df-9445-f6f4f4e3ec2a service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Received event network-vif-unplugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:06:27 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Instance destroyed successfully. 
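Editor's note: the "Acquiring lock ... / Lock ... acquired by ... :: waited / ... "released" by ... :: held" triplets above come from oslo.concurrency's lockutils wrappers, which log around every critical section they guard. The decorator form produces the ":: waited / :: held" lines (the per-instance lock around do_terminate_instance, the "compute_resources" lock in the resource tracker), while the plain Acquiring/Acquired/Releasing lines (e.g. around "refresh_cache-<uuid>" earlier in this section) come from the lock() context manager. A minimal sketch of both entry points, assuming the default in-process (non-external) locks; lock names are copied from this log purely for illustration:

# Sketch of the oslo.concurrency usage behind the lock DEBUG lines above.
from oslo_concurrency import lockutils

# Decorator form: emits "Lock <name> acquired by <func> :: waited ..." and
# "... 'released' by <func> :: held ..." around the wrapped call.
@lockutils.synchronized('compute_resources')
def update_usage():
    ...  # recompute usage, report inventory to placement

# Context-manager form: emits "Acquiring lock", "Acquired lock" and
# "Releasing lock" (the refresh_cache-<uuid> lines earlier in this section).
def refresh_cache(instance_uuid):
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        ...  # rebuild the instance network info cache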
Apr 21 14:06:27 user nova-compute[71474]: DEBUG nova.objects.instance [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lazy-loading 'resources' on Instance uuid 5cf0c20f-ffda-4578-adae-9aaef4c4bd18 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:04:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-478405333',display_name='tempest-TestMinimumBasicScenario-server-478405333',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-478405333',id=19,image_ref='a08ba1b8-74ec-4c3c-9d31-0cd58b006bb0',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBy5F7/eY34kH4kknTNvubOiNWmdv324rEiVr8ZZ6u/8wGu10U4U/vV+TgZkfkWQO0m1rbrGO251QOQqyVSfRO8QzK8Jq0lU+/cWevf7A1waDImolju4hBpvNELhkWYjog==',key_name='tempest-TestMinimumBasicScenario-414912251',keypairs=,launch_index=0,launched_at=2023-04-21T14:04:42Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='cfa1f4e6f7864477b911420ea2ecb982',ramdisk_id='',reservation_id='r-wkygfj0w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a08ba1b8-74ec-4c3c-9d31-0cd58b006bb0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-515927679',owner_user_name='tempest-TestMinimumBasicScenario-515927679-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:04:43Z,user_data=None,user_id='9d40cdc3312b43d286d8a79cde9f5418',uuid=5cf0c20f-ffda-4578-adae-9aaef4c4bd18,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "address": "fa:16:3e:97:2a:26", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4696818-c2", "ovs_interfaceid": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Converting VIF {"id": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "address": "fa:16:3e:97:2a:26", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4696818-c2", "ovs_interfaceid": "c4696818-c28f-4798-a5c4-1a4b64a5a79f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:2a:26,bridge_name='br-int',has_traffic_filtering=True,id=c4696818-c28f-4798-a5c4-1a4b64a5a79f,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4696818-c2') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG os_vif [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:2a:26,bridge_name='br-int',has_traffic_filtering=True,id=c4696818-c28f-4798-a5c4-1a4b64a5a79f,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4696818-c2') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4696818-c2, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:27 user nova-compute[71474]: INFO os_vif [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:2a:26,bridge_name='br-int',has_traffic_filtering=True,id=c4696818-c28f-4798-a5c4-1a4b64a5a79f,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4696818-c2') Apr 21 14:06:27 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Deleting instance files /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18_del Apr 21 14:06:27 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Deletion of /opt/stack/data/nova/instances/5cf0c20f-ffda-4578-adae-9aaef4c4bd18_del complete Apr 21 14:06:27 user nova-compute[71474]: INFO nova.compute.manager [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 21 14:06:27 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:06:27 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:06:28 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:06:28 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Took 0.58 seconds to deallocate network for instance. 
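Editor's note: the unplug sequence above ("Converting VIF ..." → "Converted object VIFOpenVSwitch(...)" → "Unplugging vif ..." → "Successfully unplugged vif ...") is nova handing the Neutron port data to os-vif, which turns it into a versioned VIF object and asks the 'ovs' plugin to remove the tap device from br-int (the DelPortCommand transaction). A minimal sketch of that public os-vif surface, with the id/MAC/device names copied from the log; the Network object and the omitted port profile are simplified here, so treat this as an illustration rather than the exact call nova makes:

# Sketch of the os-vif calls behind "Unplugging vif ..." above. Field values
# are taken from the log entry; the network object is reduced to a minimum.
import os_vif
from os_vif.objects import instance_info, network, vif

os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins

port = vif.VIFOpenVSwitch(
    id='c4696818-c28f-4798-a5c4-1a4b64a5a79f',
    address='fa:16:3e:97:2a:26',
    vif_name='tapc4696818-c2',
    bridge_name='br-int',
    network=network.Network(id='12b23d1e-f3a6-4c34-989d-1c89ee946e24',
                            bridge='br-int'),
)
instance = instance_info.InstanceInfo(
    uuid='5cf0c20f-ffda-4578-adae-9aaef4c4bd18',
    name='tempest-TestMinimumBasicScenario-server-478405333')

os_vif.unplug(port, instance)   # ovs plugin deletes the port from br-int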
Apr 21 14:06:28 user nova-compute[71474]: DEBUG nova.compute.manager [req-1b1940fc-4385-4eb7-a20d-b8cc7cbd0d08 req-4706eb9c-d3ec-4e2b-ad42-ed0dc643d1bb service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Received event network-vif-deleted-c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:28 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:06:28 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:06:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.263s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:28 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Deleted allocations for instance 5cf0c20f-ffda-4578-adae-9aaef4c4bd18 Apr 21 14:06:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-993420df-7193-4cf6-ae38-9b0897b85317 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.691s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:29 user nova-compute[71474]: DEBUG nova.compute.manager [req-79077539-734e-4203-9fee-3b940073fb93 req-b44b1fb7-50a9-4c85-ab27-45774729aaed service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Received event network-vif-plugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:29 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-79077539-734e-4203-9fee-3b940073fb93 req-b44b1fb7-50a9-4c85-ab27-45774729aaed service nova] Acquiring lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:29 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-79077539-734e-4203-9fee-3b940073fb93 req-b44b1fb7-50a9-4c85-ab27-45774729aaed service nova] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:29 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-79077539-734e-4203-9fee-3b940073fb93 req-b44b1fb7-50a9-4c85-ab27-45774729aaed service nova] Lock "5cf0c20f-ffda-4578-adae-9aaef4c4bd18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:29 user nova-compute[71474]: DEBUG nova.compute.manager [req-79077539-734e-4203-9fee-3b940073fb93 req-b44b1fb7-50a9-4c85-ab27-45774729aaed service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] No waiting events found dispatching network-vif-plugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:06:29 user nova-compute[71474]: WARNING nova.compute.manager [req-79077539-734e-4203-9fee-3b940073fb93 req-b44b1fb7-50a9-4c85-ab27-45774729aaed service nova] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Received unexpected event network-vif-plugged-c4696818-c28f-4798-a5c4-1a4b64a5a79f for instance with vm_state deleted and task_state None. 
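Editor's note: the inventory dict repeated in the "Inventory has not changed for provider ..." entries is what this compute node reports to Placement; the capacity the scheduler may consume from each resource class is (total - reserved) * allocation_ratio, i.e. 48 VCPU, 15511 MB of RAM and 40 GB of disk for the figures above. A quick check of that arithmetic:

# Recompute schedulable capacity from the inventory logged above.
inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, capacity)   # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0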
Apr 21 14:06:32 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:35 user nova-compute[71474]: INFO nova.compute.manager [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Terminating instance Apr 21 14:06:35 user nova-compute[71474]: DEBUG nova.compute.manager [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Start destroying the instance on the hypervisor. 
{{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG nova.compute.manager [req-f54bd141-6559-483f-ad24-abd48de581f2 req-ecd0fc60-31fe-418f-8786-d1d3539b4008 service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Received event network-vif-unplugged-5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f54bd141-6559-483f-ad24-abd48de581f2 req-ecd0fc60-31fe-418f-8786-d1d3539b4008 service nova] Acquiring lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f54bd141-6559-483f-ad24-abd48de581f2 req-ecd0fc60-31fe-418f-8786-d1d3539b4008 service nova] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f54bd141-6559-483f-ad24-abd48de581f2 req-ecd0fc60-31fe-418f-8786-d1d3539b4008 service nova] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG nova.compute.manager [req-f54bd141-6559-483f-ad24-abd48de581f2 req-ecd0fc60-31fe-418f-8786-d1d3539b4008 service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] No waiting events found dispatching network-vif-unplugged-5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:06:35 user nova-compute[71474]: DEBUG nova.compute.manager [req-f54bd141-6559-483f-ad24-abd48de581f2 req-ecd0fc60-31fe-418f-8786-d1d3539b4008 service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Received event network-vif-unplugged-5c5a52bd-c710-4614-8353-55be240cfa17 for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:06:35 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Instance destroyed successfully. 
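Editor's note: the "Received event network-vif-unplugged-... / No waiting events found dispatching ..." pairs in this termination show Neutron's notifications arriving at external_instance_event: if another thread had registered interest in that (instance, event) pair it would be woken, otherwise the event is simply logged against the instance's current task_state. Nova's actual implementation is InstanceEvents in nova/compute/manager.py; the following is only a simplified latch sketching the pattern, not nova's code:

# Simplified illustration of the "wait for / pop instance event" pattern.
# NOT nova's implementation; just the shape of it: a waiter registers a latch
# keyed by (instance_uuid, event_name), the handler pops and fires it, or
# notes that nobody was waiting.
import threading

_events = {}            # (instance_uuid, event_name) -> threading.Event
_events_lock = threading.Lock()

def prepare_for_event(instance_uuid, event_name):
    latch = threading.Event()
    with _events_lock:
        _events[(instance_uuid, event_name)] = latch
    return latch          # caller blocks on latch.wait(timeout=...)

def pop_instance_event(instance_uuid, event_name):
    with _events_lock:
        latch = _events.pop((instance_uuid, event_name), None)
    if latch is None:
        print('No waiting events found dispatching %s' % event_name)
    else:
        latch.set()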
Apr 21 14:06:35 user nova-compute[71474]: DEBUG nova.objects.instance [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lazy-loading 'resources' on Instance uuid 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:02:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-694783197',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-694783197',id=15,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T14:02:52Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='885cdc1521a14985bfa70ae21e73c693',ramdisk_id='',reservation_id='r-nipn8ele',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-28514522',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:04:43Z,user_data=None,user_id='132913991f8c45c1adaf5db7ef7cea30',uuid=4a44d9f3-28b2-45e7-b952-2bb1735ef5b5,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c5a52bd-c710-4614-8353-55be240cfa17", "address": "fa:16:3e:66:41:1d", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c5a52bd-c7", "ovs_interfaceid": "5c5a52bd-c710-4614-8353-55be240cfa17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converting VIF {"id": "5c5a52bd-c710-4614-8353-55be240cfa17", "address": "fa:16:3e:66:41:1d", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c5a52bd-c7", "ovs_interfaceid": "5c5a52bd-c710-4614-8353-55be240cfa17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:66:41:1d,bridge_name='br-int',has_traffic_filtering=True,id=5c5a52bd-c710-4614-8353-55be240cfa17,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c5a52bd-c7') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG os_vif [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:41:1d,bridge_name='br-int',has_traffic_filtering=True,id=5c5a52bd-c710-4614-8353-55be240cfa17,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c5a52bd-c7') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c5a52bd-c7, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:06:36 user nova-compute[71474]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:06:36 user nova-compute[71474]: INFO os_vif [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:41:1d,bridge_name='br-int',has_traffic_filtering=True,id=5c5a52bd-c710-4614-8353-55be240cfa17,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c5a52bd-c7') Apr 21 14:06:36 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Deleting instance files /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5_del Apr 21 14:06:36 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Deletion of /opt/stack/data/nova/instances/4a44d9f3-28b2-45e7-b952-2bb1735ef5b5_del complete Apr 21 14:06:36 user nova-compute[71474]: INFO nova.compute.manager [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 21 14:06:36 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:06:36 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Took 0.51 seconds to deallocate network for instance. 
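Editor's note: each unplug above finishes with a single-command OVSDB transaction against the local switch at tcp:127.0.0.1:6640 (the same endpoint the idle/inactivity-probe lines refer to): DelPortCommand(port=tap..., bridge=br-int, if_exists=True). A sketch of issuing the equivalent transaction directly with ovsdbapp follows; the import paths and del_port signature are based on ovsdbapp's documented Open_vSwitch usage and should be read as an assumption, not something taken from nova:

# Sketch: the equivalent of the DelPortCommand transactions in this log,
# issued directly through ovsdbapp (import paths are an assumption based on
# ovsdbapp's Open_vSwitch example, not copied from nova/os-vif).
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

# One command per transaction, mirroring "Running txn n=1 command(idx=0)".
with api.transaction(check_error=True) as txn:
    txn.add(api.del_port('tap5c5a52bd-c7', bridge='br-int', if_exists=True))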
Apr 21 14:06:36 user nova-compute[71474]: DEBUG nova.compute.manager [req-a3a550eb-2e63-433d-8a7e-10ab123506c9 req-efd30182-1efa-455a-b228-b430f881e0fc service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Received event network-vif-deleted-5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:06:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.244s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:36 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Deleted allocations for instance 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5 Apr 21 14:06:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-db2506a5-0957-44fb-92b9-d4c1690dee5b tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5" 
"released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.572s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:37 user nova-compute[71474]: DEBUG nova.compute.manager [req-9d486a63-09ff-4817-9838-64aed935deb3 req-58881d9b-36cc-4ea6-9bba-d129c62d22ee service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Received event network-vif-plugged-5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9d486a63-09ff-4817-9838-64aed935deb3 req-58881d9b-36cc-4ea6-9bba-d129c62d22ee service nova] Acquiring lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9d486a63-09ff-4817-9838-64aed935deb3 req-58881d9b-36cc-4ea6-9bba-d129c62d22ee service nova] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9d486a63-09ff-4817-9838-64aed935deb3 req-58881d9b-36cc-4ea6-9bba-d129c62d22ee service nova] Lock "4a44d9f3-28b2-45e7-b952-2bb1735ef5b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:37 user nova-compute[71474]: DEBUG nova.compute.manager [req-9d486a63-09ff-4817-9838-64aed935deb3 req-58881d9b-36cc-4ea6-9bba-d129c62d22ee service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] No waiting events found dispatching network-vif-plugged-5c5a52bd-c710-4614-8353-55be240cfa17 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:06:37 user nova-compute[71474]: WARNING nova.compute.manager [req-9d486a63-09ff-4817-9838-64aed935deb3 req-58881d9b-36cc-4ea6-9bba-d129c62d22ee service nova] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Received unexpected event network-vif-plugged-5c5a52bd-c710-4614-8353-55be240cfa17 for instance with vm_state deleted and task_state None. Apr 21 14:06:40 user nova-compute[71474]: DEBUG nova.compute.manager [req-97b5648a-a8ec-4b6d-9f98-9771ac6642ae req-3f31b161-6a27-47c8-ac35-fe259e98cbde service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Received event network-changed-7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:40 user nova-compute[71474]: DEBUG nova.compute.manager [req-97b5648a-a8ec-4b6d-9f98-9771ac6642ae req-3f31b161-6a27-47c8-ac35-fe259e98cbde service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Refreshing instance network info cache due to event network-changed-7eb11528-a882-4084-a2c7-b36fd432fecf. 
{{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:06:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-97b5648a-a8ec-4b6d-9f98-9771ac6642ae req-3f31b161-6a27-47c8-ac35-fe259e98cbde service nova] Acquiring lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:06:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-97b5648a-a8ec-4b6d-9f98-9771ac6642ae req-3f31b161-6a27-47c8-ac35-fe259e98cbde service nova] Acquired lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:06:40 user nova-compute[71474]: DEBUG nova.network.neutron [req-97b5648a-a8ec-4b6d-9f98-9771ac6642ae req-3f31b161-6a27-47c8-ac35-fe259e98cbde service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Refreshing network info cache for port 7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:06:40 user nova-compute[71474]: DEBUG nova.network.neutron [req-97b5648a-a8ec-4b6d-9f98-9771ac6642ae req-3f31b161-6a27-47c8-ac35-fe259e98cbde service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updated VIF entry in instance network info cache for port 7eb11528-a882-4084-a2c7-b36fd432fecf. {{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:06:40 user nova-compute[71474]: DEBUG nova.network.neutron [req-97b5648a-a8ec-4b6d-9f98-9771ac6642ae req-3f31b161-6a27-47c8-ac35-fe259e98cbde service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updating instance_info_cache with network_info: [{"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.101", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:06:40 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-97b5648a-a8ec-4b6d-9f98-9771ac6642ae req-3f31b161-6a27-47c8-ac35-fe259e98cbde service nova] Releasing lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:06:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:41 user nova-compute[71474]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:42 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:06:42 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] VM Stopped (Lifecycle Event) Apr 21 14:06:42 user nova-compute[71474]: DEBUG nova.compute.manager [None req-7beca62c-b8f4-433b-854e-9cbd0a5ca658 None None] [instance: 5cf0c20f-ffda-4578-adae-9aaef4c4bd18] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG nova.compute.manager [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:06:44 user nova-compute[71474]: INFO nova.compute.claims [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Claim successful on node user Apr 21 14:06:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "30068c4a-94ed-4b84-9178-0d554326fc68" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "30068c4a-94ed-4b84-9178-0d554326fc68" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:44 user nova-compute[71474]: INFO nova.compute.manager [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Terminating instance Apr 21 14:06:44 user nova-compute[71474]: DEBUG nova.compute.manager [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Start destroying the instance on the hypervisor.
{{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:44 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.361s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.compute.manager [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.compute.manager [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Allocating IP information in the background. 
{{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.network.neutron [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:06:45 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.compute.manager [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.compute.manager [req-5ea70900-6392-4291-a0ee-798dd71cd002 req-7245b471-93f4-4d2e-b650-60a98ca3c4e0 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Received event network-vif-unplugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5ea70900-6392-4291-a0ee-798dd71cd002 req-7245b471-93f4-4d2e-b650-60a98ca3c4e0 service nova] Acquiring lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5ea70900-6392-4291-a0ee-798dd71cd002 req-7245b471-93f4-4d2e-b650-60a98ca3c4e0 service nova] Lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5ea70900-6392-4291-a0ee-798dd71cd002 req-7245b471-93f4-4d2e-b650-60a98ca3c4e0 service nova] Lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.compute.manager [req-5ea70900-6392-4291-a0ee-798dd71cd002 req-7245b471-93f4-4d2e-b650-60a98ca3c4e0 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] No waiting events found dispatching network-vif-unplugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.compute.manager [req-5ea70900-6392-4291-a0ee-798dd71cd002 req-7245b471-93f4-4d2e-b650-60a98ca3c4e0 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Received event network-vif-unplugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 for instance with task_state deleting. 
{{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.compute.manager [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Start spawning the instance on the hypervisor. {{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:06:45 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Creating image(s) Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "/opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "/opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "/opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.policy [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Policy check for network:attach_external_network failed with credentials 
{'is_admin': False, 'user_id': 'ab1d2ed7df2f4a9bbf14da7e2c5fece2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0ccc2c950364fcbb0f2b1cc937f6a82', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.143s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): 
env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:45 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Instance destroyed successfully. Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.objects.instance [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lazy-loading 'resources' on Instance uuid 30068c4a-94ed-4b84-9178-0d554326fc68 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk 1073741824" returned: 0 in 0.049s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.195s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1791477557',display_name='tempest-ServersNegativeTestJSON-server-1791477557',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1791477557',id=2,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T13:58:19Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a8c210480b33473c91156b798bcbd8b2',ramdisk_id='',reservation_id='r-0bp8i0q4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1552178734',owner_user_name='tempest-ServersNegativeTestJSON-1552178734-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T13:58:19Z,user_data=None,user_id='2259f365261c49b28b56ddd1c27c125d',uuid=30068c4a-94ed-4b84-9178-0d554326fc68,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converting VIF {"id": 
"7361228d-9a8e-4921-9cb8-fc59a0a45063", "address": "fa:16:3e:3c:01:3d", "network": {"id": "d567294b-c36b-4268-af90-17560e0c43e4", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1033838809-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a8c210480b33473c91156b798bcbd8b2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7361228d-9a", "ovs_interfaceid": "7361228d-9a8e-4921-9cb8-fc59a0a45063", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3c:01:3d,bridge_name='br-int',has_traffic_filtering=True,id=7361228d-9a8e-4921-9cb8-fc59a0a45063,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7361228d-9a') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG os_vif [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3c:01:3d,bridge_name='br-int',has_traffic_filtering=True,id=7361228d-9a8e-4921-9cb8-fc59a0a45063,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7361228d-9a') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7361228d-9a, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:06:45 user nova-compute[71474]: INFO os_vif [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:3c:01:3d,bridge_name='br-int',has_traffic_filtering=True,id=7361228d-9a8e-4921-9cb8-fc59a0a45063,network=Network(d567294b-c36b-4268-af90-17560e0c43e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7361228d-9a') Apr 21 14:06:45 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Deleting instance files /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68_del Apr 21 14:06:45 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Deletion of /opt/stack/data/nova/instances/30068c4a-94ed-4b84-9178-0d554326fc68_del complete Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Checking if we can resize image /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:45 user nova-compute[71474]: INFO nova.compute.manager [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Took 0.95 seconds to destroy the instance on the hypervisor. Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Cannot resize image /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk to a smaller size. {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.objects.instance [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lazy-loading 'migration_context' on Instance uuid 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Ensure instance console log exists: /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:46 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:46 user nova-compute[71474]: DEBUG nova.network.neutron [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Successfully created port: 0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:06:46 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:46 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:06:46 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Took 0.61 seconds to deallocate network for instance. Apr 21 14:06:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-384a5a86-af78-4b7a-a6bc-bf1584069bf0 req-3353a2c2-83fe-4872-b3b3-806691a99e5e service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Received event network-vif-deleted-7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:46 user nova-compute[71474]: INFO nova.compute.manager [req-384a5a86-af78-4b7a-a6bc-bf1584069bf0 req-3353a2c2-83fe-4872-b3b3-806691a99e5e service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Neutron deleted interface 7361228d-9a8e-4921-9cb8-fc59a0a45063; detaching it from the instance and deleting it from the info cache Apr 21 14:06:46 user nova-compute[71474]: DEBUG nova.network.neutron [req-384a5a86-af78-4b7a-a6bc-bf1584069bf0 req-3353a2c2-83fe-4872-b3b3-806691a99e5e service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:06:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-384a5a86-af78-4b7a-a6bc-bf1584069bf0 req-3353a2c2-83fe-4872-b3b3-806691a99e5e service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Detach interface failed, port_id=7361228d-9a8e-4921-9cb8-fc59a0a45063, reason: Instance 30068c4a-94ed-4b84-9178-0d554326fc68 could not be found. 
{{(pid=71474) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 14:06:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:46 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:06:46 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:06:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.221s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:46 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Deleted allocations for instance 30068c4a-94ed-4b84-9178-0d554326fc68 Apr 21 14:06:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-13e7cd54-9120-4d9b-93c9-dc1ce9c82cd3 tempest-ServersNegativeTestJSON-1552178734 tempest-ServersNegativeTestJSON-1552178734-project-member] Lock "30068c4a-94ed-4b84-9178-0d554326fc68" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.960s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.network.neutron [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 
91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Successfully updated port: 0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "refresh_cache-91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquired lock "refresh_cache-91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.network.neutron [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.network.neutron [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Instance cache missing network info. {{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.compute.manager [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Received event network-vif-plugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] Acquiring lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] Lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] Lock "30068c4a-94ed-4b84-9178-0d554326fc68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.compute.manager [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] 
No waiting events found dispatching network-vif-plugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:06:47 user nova-compute[71474]: WARNING nova.compute.manager [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Received unexpected event network-vif-plugged-7361228d-9a8e-4921-9cb8-fc59a0a45063 for instance with vm_state deleted and task_state None. Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.compute.manager [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Received event network-changed-0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.compute.manager [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Refreshing instance network info cache due to event network-changed-0b9909b1-cbc2-4a32-9744-599b789730dc. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] Acquiring lock "refresh_cache-91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.network.neutron [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Updating instance_info_cache with network_info: [{"id": "0b9909b1-cbc2-4a32-9744-599b789730dc", "address": "fa:16:3e:47:d6:29", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b9909b1-cb", "ovs_interfaceid": "0b9909b1-cbc2-4a32-9744-599b789730dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Releasing lock "refresh_cache-91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.compute.manager [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b 
tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Instance network_info: |[{"id": "0b9909b1-cbc2-4a32-9744-599b789730dc", "address": "fa:16:3e:47:d6:29", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b9909b1-cb", "ovs_interfaceid": "0b9909b1-cbc2-4a32-9744-599b789730dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] Acquired lock "refresh_cache-91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.network.neutron [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Refreshing network info cache for port 0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Start _get_guest_xml network_info=[{"id": "0b9909b1-cbc2-4a32-9744-599b789730dc", "address": "fa:16:3e:47:d6:29", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b9909b1-cb", "ovs_interfaceid": "0b9909b1-cbc2-4a32-9744-599b789730dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:06:47 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:06:47 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 
tempest-AttachVolumeNegativeTest-166063504-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:06:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1414365371',display_name='tempest-AttachVolumeNegativeTest-server-1414365371',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1414365371',id=22,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVTbSwwxECYrofWLzM3xM2athtWkhHO4PmnRUvV4IeHkFrsz3GVwS5pKQyGAUvFsHgrVRcBmNgHjdWVkJa8/B3vkVSYjn5BwhRB1DM72Kz9Nxe+lrLjXM+s4ubHvbbIIg==',key_name='tempest-keypair-571092211',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0ccc2c950364fcbb0f2b1cc937f6a82',ramdisk_id='',reservation_id='r-ji6g5x0c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-166063504',owner_user_name='tempest-AttachVolumeNegativeTest-166063504-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:06:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ab1d2ed7df2f4a9bbf14da7e2c5fece2',uuid=91696ea3-6e52-4506-ba4d-7f87f7b9f5b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b9909b1-cbc2-4a32-9744-599b789730dc", "address": "fa:16:3e:47:d6:29", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b9909b1-cb", "ovs_interfaceid": "0b9909b1-cbc2-4a32-9744-599b789730dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config 
/opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converting VIF {"id": "0b9909b1-cbc2-4a32-9744-599b789730dc", "address": "fa:16:3e:47:d6:29", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b9909b1-cb", "ovs_interfaceid": "0b9909b1-cbc2-4a32-9744-599b789730dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}}
Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:29,bridge_name='br-int',has_traffic_filtering=True,id=0b9909b1-cbc2-4a32-9744-599b789730dc,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b9909b1-cb') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}}
Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.objects.instance [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lazy-loading 'pci_devices' on Instance uuid 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}}
Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] End _get_guest_xml xml= [guest domain XML omitted: the XML markup was stripped in this capture, leaving only element text interleaved with timestamps; the surviving values include uuid 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1, name instance-00000016, memory 131072, 1 vCPU, display name tempest-AttachVolumeNegativeTest-server-1414365371, creation time 2023-04-21 14:06:47, flavor values 128/1/0/0/1, owner tempest-AttachVolumeNegativeTest-166063504-project-member / tempest-AttachVolumeNegativeTest-166063504, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, Virtual Machine, hvm, CPU model Nehalem, and RNG backend /dev/urandom] {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:06:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1414365371',display_name='tempest-AttachVolumeNegativeTest-server-1414365371',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1414365371',id=22,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVTbSwwxECYrofWLzM3xM2athtWkhHO4PmnRUvV4IeHkFrsz3GVwS5pKQyGAUvFsHgrVRcBmNgHjdWVkJa8/B3vkVSYjn5BwhRB1DM72Kz9Nxe+lrLjXM+s4ubHvbbIIg==',key_name='tempest-keypair-571092211',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0ccc2c950364fcbb0f2b1cc937f6a82',ramdisk_id='',reservation_id='r-ji6g5x0c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-166063504',owner_user_name='tempest-AttachVolumeNegativeTest-166063504-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:06:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ab1d2ed7df2f4a9bbf14da7e2c5fece2',uuid=91696ea3-6e52-4506-ba4d-7f87f7b9f5b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b9909b1-cbc2-4a32-9744-599b789730dc", "address": "fa:16:3e:47:d6:29", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b9909b1-cb", "ovs_interfaceid": "0b9909b1-cbc2-4a32-9744-599b789730dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug 
/opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converting VIF {"id": "0b9909b1-cbc2-4a32-9744-599b789730dc", "address": "fa:16:3e:47:d6:29", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b9909b1-cb", "ovs_interfaceid": "0b9909b1-cbc2-4a32-9744-599b789730dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:29,bridge_name='br-int',has_traffic_filtering=True,id=0b9909b1-cbc2-4a32-9744-599b789730dc,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b9909b1-cb') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG os_vif [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:29,bridge_name='br-int',has_traffic_filtering=True,id=0b9909b1-cbc2-4a32-9744-599b789730dc,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b9909b1-cb') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b9909b1-cb, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b9909b1-cb, col_values=(('external_ids', {'iface-id': '0b9909b1-cbc2-4a32-9744-599b789730dc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:d6:29', 'vm-uuid': '91696ea3-6e52-4506-ba4d-7f87f7b9f5b1'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:47 user nova-compute[71474]: INFO os_vif [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:29,bridge_name='br-int',has_traffic_filtering=True,id=0b9909b1-cbc2-4a32-9744-599b789730dc,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b9909b1-cb') Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] No VIF found with MAC fa:16:3e:47:d6:29, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.network.neutron [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Updated VIF entry in instance network info cache for port 0b9909b1-cbc2-4a32-9744-599b789730dc. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG nova.network.neutron [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Updating instance_info_cache with network_info: [{"id": "0b9909b1-cbc2-4a32-9744-599b789730dc", "address": "fa:16:3e:47:d6:29", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b9909b1-cb", "ovs_interfaceid": "0b9909b1-cbc2-4a32-9744-599b789730dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:06:47 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f2105737-e576-491a-9f1f-eb11b64decb0 req-90e89e04-70d4-4b63-b9dd-50a71d35d670 service nova] Releasing lock "refresh_cache-91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:06:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:49 user nova-compute[71474]: DEBUG nova.compute.manager [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Received event network-vif-plugged-0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] Acquiring lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:49 user nova-compute[71474]: DEBUG nova.compute.manager [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] No waiting events found dispatching network-vif-plugged-0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:06:49 user nova-compute[71474]: WARNING nova.compute.manager [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Received unexpected event network-vif-plugged-0b9909b1-cbc2-4a32-9744-599b789730dc for instance with vm_state building and task_state spawning. Apr 21 14:06:49 user nova-compute[71474]: DEBUG nova.compute.manager [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Received event network-vif-plugged-0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:06:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] Acquiring lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:49 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:49 user nova-compute[71474]: DEBUG nova.compute.manager [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] No waiting events found dispatching network-vif-plugged-0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:06:49 user nova-compute[71474]: WARNING nova.compute.manager [req-5f07eef1-4b2e-4154-bcb6-4e42dfaf97ad req-98dfd0a4-d1a0-41f8-ae51-0260c3709b9d service nova] [instance: 
91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Received unexpected event network-vif-plugged-0b9909b1-cbc2-4a32-9744-599b789730dc for instance with vm_state building and task_state spawning. Apr 21 14:06:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:50 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:06:51 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] VM Stopped (Lifecycle Event) Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-c5b3dac8-9c81-410a-9f19-e42eefa64020 None None] [instance: 4a44d9f3-28b2-45e7-b952-2bb1735ef5b5] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:06:51 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] VM Resumed (Lifecycle Event) Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:06:51 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Instance spawned successfully. 
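The os_vif plug recorded a little earlier (the ovsdbapp transaction running AddBridgeCommand(name=br-int, may_exist=True), then AddPortCommand(bridge=br-int, port=tap0b9909b1-cb) and DbSetCommand setting external_ids on the Interface row) boils down to three OVSDB operations. A minimal sketch of an equivalent sequence driven through ovs-vsctl, with the bridge, tap device, port ID, MAC and instance UUID copied from the log entries above; this is illustrative only, not os-vif's actual code path:

import subprocess

BRIDGE = "br-int"
TAP = "tap0b9909b1-cb"
PORT_ID = "0b9909b1-cbc2-4a32-9744-599b789730dc"
MAC = "fa:16:3e:47:d6:29"
VM_UUID = "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1"

def vsctl(*args: str) -> None:
    # ovs-vsctl issues the same kind of OVSDB transaction that ovsdbapp logs above.
    subprocess.run(["ovs-vsctl", *args], check=True)

vsctl("--may-exist", "add-br", BRIDGE)           # AddBridgeCommand(may_exist=True)
vsctl("--may-exist", "add-port", BRIDGE, TAP)    # AddPortCommand(may_exist=True)
vsctl("set", "Interface", TAP,                   # DbSetCommand on external_ids
      "external_ids:iface-id=" + PORT_ID,
      "external_ids:iface-status=active",
      "external_ids:attached-mac=" + MAC,
      "external_ids:vm-uuid=" + VM_UUID)

The "Transaction caused no change" entry above shows that br-int already existed, so the add-bridge step was effectively a no-op.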
Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 
91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:06:51 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:06:51 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] VM Started (Lifecycle Event) Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:06:51 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:06:51 user nova-compute[71474]: INFO nova.compute.manager [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Took 6.01 seconds to spawn the instance on the hypervisor. Apr 21 14:06:51 user nova-compute[71474]: DEBUG nova.compute.manager [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:06:51 user nova-compute[71474]: INFO nova.compute.manager [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Took 6.71 seconds to build instance. 
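The update_available_resource periodic task entries that follow audit local disk usage by running qemu-img info against each instance's disk files, wrapped in oslo_concurrency.prlimit to cap the probe at 1 GiB of address space and 30 s of CPU time. A rough sketch of that probe, with the command line and one disk path copied from the log below; this is a simplification for reading the log, not Nova's actual implementation:

import json
import subprocess

def qemu_img_info(path: str) -> dict:
    # Same invocation pattern as the log entries: prlimit bounds the qemu-img
    # process, and --force-share allows probing a disk a running guest holds open.
    cmd = [
        "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
        "--as=1073741824", "--cpu=30", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", path, "--force-share", "--output=json",
    ]
    out = subprocess.run(cmd, check=True, capture_output=True, text=True).stdout
    return json.loads(out)

info = qemu_img_info(
    "/opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk")
print(info["format"], info["virtual-size"], info["actual-size"])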
Apr 21 14:06:51 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-c25a10af-9723-4d60-bc2c-1722c38efc3b tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.800s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:52 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:56 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:57 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:06:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:06:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:06:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:06:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:06:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:06:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:06:57 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:06:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.rescue --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.rescue --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.rescue --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk.rescue --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:06:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share 
--output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:06:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:00 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:07:00 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8497MB free_disk=26.026012420654297GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 5e502c4c-a46b-4670-acba-2fda2d05adf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance a205a2a4-c0de-4c5c-abc4-7b034070e014 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 80eb182f-948b-42d3-999b-339c5d615a73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance b5e2e065-1b7d-4cbf-b31a-923ae2f92fff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.302s 
{{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:07:00 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] VM Stopped (Lifecycle Event) Apr 21 14:07:00 user nova-compute[71474]: DEBUG nova.compute.manager [None req-daf07f5d-b945-426d-ad96-ccd4fbfa591e None None] [instance: 30068c4a-94ed-4b84-9178-0d554326fc68] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:07:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:07:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:07:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:07:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:07:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:07:02 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Updating instance_info_cache with network_info: [{"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": "fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:07:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None 
req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-5e502c4c-a46b-4670-acba-2fda2d05adf5" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:07:02 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:07:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:07:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:07:02 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:07:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:07:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:07:02 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:07:05 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:07:06 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:07 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:12 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:16 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:17 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:20 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "eb793e62-10c7-4bc3-834b-4a046bd33462" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:20 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:20 user nova-compute[71474]: DEBUG nova.compute.manager [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Starting instance... 
{{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:07:20 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:20 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:20 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:07:20 user nova-compute[71474]: INFO nova.compute.claims [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Claim successful on node user Apr 21 14:07:21 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG nova.compute.manager [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 
tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG nova.compute.manager [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG nova.network.neutron [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:07:21 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:07:21 user nova-compute[71474]: DEBUG nova.compute.manager [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG nova.policy [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d40cdc3312b43d286d8a79cde9f5418', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cfa1f4e6f7864477b911420ea2ecb982', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG nova.compute.manager [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Start spawning the instance on the hypervisor. 
{{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:07:21 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Creating image(s) Apr 21 14:07:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "/opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "/opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "/opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "e98b298ec29d31df736aae2ce4638bff7782df4f" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "e98b298ec29d31df736aae2ce4638bff7782df4f" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f.part --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f.part --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG nova.virt.images [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] 6d7eb54a-b068-4162-bd98-5a21649fbd2b was qcow2, converting to raw {{(pid=71474) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG nova.privsep.utils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71474) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 21 14:07:21 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f.part /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f.converted {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG nova.network.neutron [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Successfully created port: 5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f.part /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f.converted" returned: 0 in 0.161s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f.converted --force-share --output=json {{(pid=71474) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f.converted --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "e98b298ec29d31df736aae2ce4638bff7782df4f" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.938s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "e98b298ec29d31df736aae2ce4638bff7782df4f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "e98b298ec29d31df736aae2ce4638bff7782df4f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f --force-share --output=json" returned: 0 in 0.147s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f,backing_fmt=raw /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f,backing_fmt=raw /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk 1073741824" returned: 0 in 0.340s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "e98b298ec29d31df736aae2ce4638bff7782df4f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.492s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG nova.network.neutron [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Successfully updated port: 5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) _update_port 
/opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG nova.compute.manager [req-298ee2a8-368c-4b1d-980e-f0fe92c86044 req-c2e8906e-2f34-46a1-b37a-f859515ef5b2 service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Received event network-changed-5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG nova.compute.manager [req-298ee2a8-368c-4b1d-980e-f0fe92c86044 req-c2e8906e-2f34-46a1-b37a-f859515ef5b2 service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Refreshing instance network info cache due to event network-changed-5aa6dd25-1817-44da-9879-ccebac68be61. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-298ee2a8-368c-4b1d-980e-f0fe92c86044 req-c2e8906e-2f34-46a1-b37a-f859515ef5b2 service nova] Acquiring lock "refresh_cache-eb793e62-10c7-4bc3-834b-4a046bd33462" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-298ee2a8-368c-4b1d-980e-f0fe92c86044 req-c2e8906e-2f34-46a1-b37a-f859515ef5b2 service nova] Acquired lock "refresh_cache-eb793e62-10c7-4bc3-834b-4a046bd33462" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG nova.network.neutron [req-298ee2a8-368c-4b1d-980e-f0fe92c86044 req-c2e8906e-2f34-46a1-b37a-f859515ef5b2 service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Refreshing network info cache for port 5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:07:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "refresh_cache-eb793e62-10c7-4bc3-834b-4a046bd33462" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG nova.network.neutron [req-298ee2a8-368c-4b1d-980e-f0fe92c86044 req-c2e8906e-2f34-46a1-b37a-f859515ef5b2 service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Instance cache missing network info. {{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/e98b298ec29d31df736aae2ce4638bff7782df4f --force-share --output=json" returned: 0 in 0.133s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Checking if we can resize image /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk. 
size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Cannot resize image /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk to a smaller size. {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG nova.objects.instance [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lazy-loading 'migration_context' on Instance uuid eb793e62-10c7-4bc3-834b-4a046bd33462 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Ensure instance console log exists: /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG nova.network.neutron [req-298ee2a8-368c-4b1d-980e-f0fe92c86044 req-c2e8906e-2f34-46a1-b37a-f859515ef5b2 service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-298ee2a8-368c-4b1d-980e-f0fe92c86044 req-c2e8906e-2f34-46a1-b37a-f859515ef5b2 service nova] Releasing lock "refresh_cache-eb793e62-10c7-4bc3-834b-4a046bd33462" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquired lock "refresh_cache-eb793e62-10c7-4bc3-834b-4a046bd33462" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG nova.network.neutron [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:07:23 user nova-compute[71474]: DEBUG nova.network.neutron [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.network.neutron [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Updating instance_info_cache with network_info: [{"id": "5aa6dd25-1817-44da-9879-ccebac68be61", "address": "fa:16:3e:03:9a:93", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa6dd25-18", "ovs_interfaceid": "5aa6dd25-1817-44da-9879-ccebac68be61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Releasing lock "refresh_cache-eb793e62-10c7-4bc3-834b-4a046bd33462" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.compute.manager [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Instance network_info: |[{"id": "5aa6dd25-1817-44da-9879-ccebac68be61", "address": "fa:16:3e:03:9a:93", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa6dd25-18", "ovs_interfaceid": "5aa6dd25-1817-44da-9879-ccebac68be61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Start 
_get_guest_xml network_info=[{"id": "5aa6dd25-1817-44da-9879-ccebac68be61", "address": "fa:16:3e:03:9a:93", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa6dd25-18", "ovs_interfaceid": "5aa6dd25-1817-44da-9879-ccebac68be61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T14:07:18Z,direct_url=,disk_format='qcow2',id=6d7eb54a-b068-4162-bd98-5a21649fbd2b,min_disk=0,min_ram=0,name='tempest-scenario-img--916643422',owner='cfa1f4e6f7864477b911420ea2ecb982',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T14:07:19Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '6d7eb54a-b068-4162-bd98-5a21649fbd2b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:07:24 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:07:24 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
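Annotation: the entries above show how the image resize check works. Nova runs qemu-img info (wrapped in oslo_concurrency.prlimit to cap address space and CPU time) against the base image and the instance disk, then refuses to shrink: the requested root size (size=1073741824, the 1 GiB m1.nano root disk) does not exceed the disk's current virtual size, so it logs "Cannot resize image ... to a smaller size." A minimal sketch of that check follows, assuming only that qemu-img is on PATH; can_resize_image here is an illustrative stand-in, not Nova's implementation.

    import json
    import subprocess

    def virtual_size(path):
        """Virtual size in bytes, as reported by `qemu-img info --output=json`."""
        out = subprocess.check_output(
            ["qemu-img", "info", "--force-share", "--output=json", path])
        return json.loads(out)["virtual-size"]

    def can_resize_image(path, new_size):
        """Illustrative mirror of the logged check: only growing the disk is allowed."""
        return new_size > virtual_size(path)

    # The request above used size=1073741824 (1 GiB), which did not exceed the
    # disk's current virtual size, hence "Cannot resize image ... to a smaller size."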
Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T14:07:18Z,direct_url=,disk_format='qcow2',id=6d7eb54a-b068-4162-bd98-5a21649fbd2b,min_disk=0,min_ram=0,name='tempest-scenario-img--916643422',owner='cfa1f4e6f7864477b911420ea2ecb982',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T14:07:19Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:07:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2134957043',display_name='tempest-TestMinimumBasicScenario-server-2134957043',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-2134957043',id=23,image_ref='6d7eb54a-b068-4162-bd98-5a21649fbd2b',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCO55/5HhCQZkcFVfgVGrulI3atlpFKQjKnI78/kM4uimAx0bJQlbEfcxM2XQeW2c86vQX6nUo1+E0FJJHQNMap4lIvFfRYGCEovcOc6baFnAovI9vNKGBK33RQ9htMU1w==',key_name='tempest-TestMinimumBasicScenario-788075712',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cfa1f4e6f7864477b911420ea2ecb982',ramdisk_id='',reservation_id='r-bcbw0udz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6d7eb54a-b068-4162-bd98-5a21649fbd2b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-515927679',owner_user_name='tempest-TestMinimumBasicScenario-515927679-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:07:21Z,user_data=None,user_id='9d40cdc3312b43d286d8a79cde9f5418',uuid=eb793e62-10c7-4bc3-834b-4a046bd33462,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5aa6dd25-1817-44da-9879-ccebac68be61", "address": "fa:16:3e:03:9a:93", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa6dd25-18", "ovs_interfaceid": "5aa6dd25-1817-44da-9879-ccebac68be61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Converting VIF {"id": "5aa6dd25-1817-44da-9879-ccebac68be61", "address": "fa:16:3e:03:9a:93", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa6dd25-18", "ovs_interfaceid": 
"5aa6dd25-1817-44da-9879-ccebac68be61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:9a:93,bridge_name='br-int',has_traffic_filtering=True,id=5aa6dd25-1817-44da-9879-ccebac68be61,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa6dd25-18') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.objects.instance [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lazy-loading 'pci_devices' on Instance uuid eb793e62-10c7-4bc3-834b-4a046bd33462 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] End _get_guest_xml xml= Apr 21 14:07:24 user nova-compute[71474]: eb793e62-10c7-4bc3-834b-4a046bd33462 Apr 21 14:07:24 user nova-compute[71474]: instance-00000017 Apr 21 14:07:24 user nova-compute[71474]: 131072 Apr 21 14:07:24 user nova-compute[71474]: 1 Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: tempest-TestMinimumBasicScenario-server-2134957043 Apr 21 14:07:24 user nova-compute[71474]: 2023-04-21 14:07:24 Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: 128 Apr 21 14:07:24 user nova-compute[71474]: 1 Apr 21 14:07:24 user nova-compute[71474]: 0 Apr 21 14:07:24 user nova-compute[71474]: 0 Apr 21 14:07:24 user nova-compute[71474]: 1 Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: tempest-TestMinimumBasicScenario-515927679-project-member Apr 21 14:07:24 user nova-compute[71474]: tempest-TestMinimumBasicScenario-515927679 Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: OpenStack Foundation Apr 21 14:07:24 user nova-compute[71474]: OpenStack Nova Apr 21 14:07:24 user nova-compute[71474]: 0.0.0 Apr 21 14:07:24 user nova-compute[71474]: eb793e62-10c7-4bc3-834b-4a046bd33462 Apr 21 14:07:24 user nova-compute[71474]: eb793e62-10c7-4bc3-834b-4a046bd33462 Apr 21 14:07:24 user nova-compute[71474]: Virtual Machine Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 
21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: hvm Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Nehalem Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: /dev/urandom Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: Apr 21 14:07:24 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:07:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2134957043',display_name='tempest-TestMinimumBasicScenario-server-2134957043',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-2134957043',id=23,image_ref='6d7eb54a-b068-4162-bd98-5a21649fbd2b',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCO55/5HhCQZkcFVfgVGrulI3atlpFKQjKnI78/kM4uimAx0bJQlbEfcxM2XQeW2c86vQX6nUo1+E0FJJHQNMap4lIvFfRYGCEovcOc6baFnAovI9vNKGBK33RQ9htMU1w==',key_name='tempest-TestMinimumBasicScenario-788075712',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cfa1f4e6f7864477b911420ea2ecb982',ramdisk_id='',reservation_id='r-bcbw0udz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6d7eb54a-b068-4162-bd98-5a21649fbd2b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-515927679',owner_user_name='tempest-TestMinimumBasicScenario-515927679-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:07:21Z,user_data=None,user_id='9d40cdc3312b43d286d8a79cde9f5418',uuid=eb793e62-10c7-4bc3-834b-4a046bd33462,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5aa6dd25-1817-44da-9879-ccebac68be61", "address": "fa:16:3e:03:9a:93", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa6dd25-18", "ovs_interfaceid": "5aa6dd25-1817-44da-9879-ccebac68be61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Converting VIF {"id": "5aa6dd25-1817-44da-9879-ccebac68be61", "address": "fa:16:3e:03:9a:93", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa6dd25-18", "ovs_interfaceid": 
"5aa6dd25-1817-44da-9879-ccebac68be61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:9a:93,bridge_name='br-int',has_traffic_filtering=True,id=5aa6dd25-1817-44da-9879-ccebac68be61,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa6dd25-18') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG os_vif [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:9a:93,bridge_name='br-int',has_traffic_filtering=True,id=5aa6dd25-1817-44da-9879-ccebac68be61,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa6dd25-18') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5aa6dd25-18, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5aa6dd25-18, col_values=(('external_ids', {'iface-id': '5aa6dd25-1817-44da-9879-ccebac68be61', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:9a:93', 'vm-uuid': 'eb793e62-10c7-4bc3-834b-4a046bd33462'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:24 user nova-compute[71474]: INFO os_vif [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:9a:93,bridge_name='br-int',has_traffic_filtering=True,id=5aa6dd25-1817-44da-9879-ccebac68be61,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa6dd25-18') Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:07:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] No VIF found with MAC fa:16:3e:03:9a:93, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:07:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-856b725b-50f0-4bc0-8830-dbaad75f5ee7 req-fe33f6ec-147a-4a59-b9f6-ec21dfe2e42e service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Received event network-vif-plugged-5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:07:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-856b725b-50f0-4bc0-8830-dbaad75f5ee7 req-fe33f6ec-147a-4a59-b9f6-ec21dfe2e42e service nova] Acquiring lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-856b725b-50f0-4bc0-8830-dbaad75f5ee7 req-fe33f6ec-147a-4a59-b9f6-ec21dfe2e42e service nova] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils 
[req-856b725b-50f0-4bc0-8830-dbaad75f5ee7 req-fe33f6ec-147a-4a59-b9f6-ec21dfe2e42e service nova] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-856b725b-50f0-4bc0-8830-dbaad75f5ee7 req-fe33f6ec-147a-4a59-b9f6-ec21dfe2e42e service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] No waiting events found dispatching network-vif-plugged-5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:07:25 user nova-compute[71474]: WARNING nova.compute.manager [req-856b725b-50f0-4bc0-8830-dbaad75f5ee7 req-fe33f6ec-147a-4a59-b9f6-ec21dfe2e42e service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Received unexpected event network-vif-plugged-5aa6dd25-1817-44da-9879-ccebac68be61 for instance with vm_state building and task_state spawning. Apr 21 14:07:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "5e502c4c-a46b-4670-acba-2fda2d05adf5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:26 user nova-compute[71474]: INFO nova.compute.manager [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Terminating instance Apr 21 14:07:26 user nova-compute[71474]: DEBUG nova.compute.manager [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG nova.compute.manager [req-a3edbfca-65ad-4dad-a383-ead35bc45217 req-10f0cb1e-4aaf-41e7-9ca4-c2988188a9b0 service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Received event network-vif-unplugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a3edbfca-65ad-4dad-a383-ead35bc45217 req-10f0cb1e-4aaf-41e7-9ca4-c2988188a9b0 service nova] Acquiring lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a3edbfca-65ad-4dad-a383-ead35bc45217 req-10f0cb1e-4aaf-41e7-9ca4-c2988188a9b0 service nova] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-a3edbfca-65ad-4dad-a383-ead35bc45217 
req-10f0cb1e-4aaf-41e7-9ca4-c2988188a9b0 service nova] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG nova.compute.manager [req-a3edbfca-65ad-4dad-a383-ead35bc45217 req-10f0cb1e-4aaf-41e7-9ca4-c2988188a9b0 service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] No waiting events found dispatching network-vif-unplugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG nova.compute.manager [req-a3edbfca-65ad-4dad-a383-ead35bc45217 req-10f0cb1e-4aaf-41e7-9ca4-c2988188a9b0 service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Received event network-vif-unplugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Instance destroyed successfully. 
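Annotation: the os-vif plug step above is an OVSDB transaction (AddBridgeCommand, AddPortCommand, DbSetCommand) that adds tap5aa6dd25-18 to br-int and stamps the Interface row's external_ids with the Neutron port id, MAC and instance uuid; OVN binds the port by that iface-id, after which Neutron sends the network-vif-plugged event seen a moment later. A rough command-line equivalent, for illustration only (os-vif actually talks to ovsdb-server through the ovsdbapp IDL rather than shelling out):

    import subprocess

    def plug_ovs_port(bridge, dev, iface_id, mac, instance_uuid):
        """Approximate the AddBridge/AddPort/DbSet transaction with ovs-vsctl."""
        subprocess.check_call(["ovs-vsctl", "--may-exist", "add-br", bridge])
        subprocess.check_call([
            "ovs-vsctl", "--may-exist", "add-port", bridge, dev,
            "--", "set", "Interface", dev,
            "external_ids:iface-id=%s" % iface_id,
            "external_ids:iface-status=active",
            "external_ids:attached-mac=%s" % mac,
            "external_ids:vm-uuid=%s" % instance_uuid,
        ])

    # Values from the transaction logged above:
    # plug_ovs_port("br-int", "tap5aa6dd25-18",
    #               "5aa6dd25-1817-44da-9879-ccebac68be61",
    #               "fa:16:3e:03:9a:93",
    #               "eb793e62-10c7-4bc3-834b-4a046bd33462")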
Apr 21 14:07:26 user nova-compute[71474]: DEBUG nova.objects.instance [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lazy-loading 'resources' on Instance uuid 5e502c4c-a46b-4670-acba-2fda2d05adf5 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T13:58:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1731146767',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1731146767',id=7,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T13:59:04Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='885cdc1521a14985bfa70ae21e73c693',ramdisk_id='',reservation_id='r-swvv899m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-28514522',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:00:51Z,user_data=None,user_id='132913991f8c45c1adaf5db7ef7cea30',uuid=5e502c4c-a46b-4670-acba-2fda2d05adf5,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": "fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converting VIF {"id": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "address": "fa:16:3e:2a:5f:60", "network": {"id": "4b38afb7-2b53-44fc-a4e0-7d79bef71734", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-935140606-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "885cdc1521a14985bfa70ae21e73c693", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ba354a7-6f", "ovs_interfaceid": "9ba354a7-6fb2-4eb1-96f4-edb58950895e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2a:5f:60,bridge_name='br-int',has_traffic_filtering=True,id=9ba354a7-6fb2-4eb1-96f4-edb58950895e,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ba354a7-6f') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG os_vif [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2a:5f:60,bridge_name='br-int',has_traffic_filtering=True,id=9ba354a7-6fb2-4eb1-96f4-edb58950895e,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ba354a7-6f') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ba354a7-6f, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:07:26 user nova-compute[71474]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:26 user nova-compute[71474]: INFO os_vif [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2a:5f:60,bridge_name='br-int',has_traffic_filtering=True,id=9ba354a7-6fb2-4eb1-96f4-edb58950895e,network=Network(4b38afb7-2b53-44fc-a4e0-7d79bef71734),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ba354a7-6f') Apr 21 14:07:26 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Deleting instance files /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5_del Apr 21 14:07:26 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Deletion of /opt/stack/data/nova/instances/5e502c4c-a46b-4670-acba-2fda2d05adf5_del complete Apr 21 14:07:26 user nova-compute[71474]: INFO nova.compute.manager [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 21 14:07:26 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:07:26 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:07:27 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Took 0.43 seconds to deallocate network for instance. 
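Annotation: almost every step in this trace is bracketed by oslo.concurrency lock messages. The bare Acquiring/Acquired/Releasing lines (lockutils.py:312/315/333) come from using lockutils.lock() as a context manager, for example around the refresh_cache-&lt;uuid&gt; sections, while the `acquired by "..." :: waited` / `"released" ... :: held` pairs (lockutils.py:404-423) come from functions wrapped with the synchronized() decorator, such as do_terminate_instance and the resource tracker's update_usage. A minimal sketch of both forms, with placeholder lock names and bodies:

    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def update_usage():
        # Entering and leaving this wrapper produces the
        # 'Lock "..." acquired by ... :: waited' / '"released" ... :: held' pairs.
        pass

    def refresh_cache(instance_uuid):
        # Using the context manager directly produces the plain
        # Acquiring/Acquired/Releasing lock lines.
        with lockutils.lock("refresh_cache-%s" % instance_uuid):
            pass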
Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.compute.manager [req-bd5fafc6-6e96-4652-a9cd-ea5cf1de0ad9 req-1050a05c-cd27-41d7-892a-1f656dc0d2ee service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Received event network-vif-deleted-9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:07:27 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] VM Resumed (Lifecycle Event) Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:07:27 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Instance spawned successfully. 
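Annotation: the lifecycle handling just below compares the integer power state stored in the database with the state libvirt reports after the "Resumed" event ("current DB power_state: 0, VM power_state: 1"); because the instance still has task_state spawning, sync_power_state skips any correction. The 0/1 values are Nova's NOSTATE and RUNNING constants; the mapping sketched here is recalled from nova.compute.power_state and should be treated as an assumption to verify against the source tree.

    # Assumed mapping (mirrors nova.compute.power_state; verify against your tree).
    POWER_STATES = {
        0x00: "NOSTATE",    # DB power_state: 0 while the record is still being built
        0x01: "RUNNING",    # VM power_state: 1 reported after the Resumed event
        0x03: "PAUSED",
        0x04: "SHUTDOWN",
        0x06: "CRASHED",
        0x07: "SUSPENDED",
    }

    def describe_sync(db_state, vm_state, task_state):
        """Decode a 'Synchronizing instance power state' line; a pending task
        (here 'spawning') means the sync is skipped, as the following INFO line shows."""
        verdict = "skip (pending task)" if task_state else "reconcile"
        return "db=%s vm=%s -> %s" % (
            POWER_STATES.get(db_state, db_state),
            POWER_STATES.get(vm_state, vm_state),
            verdict)

    # describe_sync(0, 1, "spawning") -> 'db=NOSTATE vm=RUNNING -> skip (pending task)'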
Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: 
building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:07:27 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:07:27 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] VM Started (Lifecycle Event) Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.compute.manager [req-7fb261aa-f062-4549-8ae6-24d50c9d6bff req-17d8cd3a-7b92-4903-912d-69422a71761b service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Received event network-vif-plugged-5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7fb261aa-f062-4549-8ae6-24d50c9d6bff req-17d8cd3a-7b92-4903-912d-69422a71761b service nova] Acquiring lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:27 user 
nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7fb261aa-f062-4549-8ae6-24d50c9d6bff req-17d8cd3a-7b92-4903-912d-69422a71761b service nova] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7fb261aa-f062-4549-8ae6-24d50c9d6bff req-17d8cd3a-7b92-4903-912d-69422a71761b service nova] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.compute.manager [req-7fb261aa-f062-4549-8ae6-24d50c9d6bff req-17d8cd3a-7b92-4903-912d-69422a71761b service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] No waiting events found dispatching network-vif-plugged-5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:07:27 user nova-compute[71474]: WARNING nova.compute.manager [req-7fb261aa-f062-4549-8ae6-24d50c9d6bff req-17d8cd3a-7b92-4903-912d-69422a71761b service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Received unexpected event network-vif-plugged-5aa6dd25-1817-44da-9879-ccebac68be61 for instance with vm_state building and task_state spawning. Apr 21 14:07:27 user nova-compute[71474]: INFO nova.compute.manager [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Took 6.35 seconds to spawn the instance on the hypervisor. Apr 21 14:07:27 user nova-compute[71474]: DEBUG nova.compute.manager [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:07:27 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] During sync_power_state the instance has a pending task (spawning). Skip. 
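[annotation] The inventory payload reported to placement above encodes the host's schedulable capacity: for each resource class, placement can hand out roughly (total - reserved) * allocation_ratio, which for this host works out to 48 VCPU, 15511 MB of RAM and 40 GB of disk. A small check of that arithmetic using the exact values from the log (a sketch of the capacity rule as I understand placement to apply it):

    # Recompute the schedulable capacity implied by the inventory dict that
    # nova.scheduler.client.report logs above:
    #   capacity = (total - reserved) * allocation_ratio
    inventory = {
        'VCPU': {'total': 12, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 40, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: {capacity:g}")   # VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40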
Apr 21 14:07:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.264s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:27 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Deleted allocations for instance 5e502c4c-a46b-4670-acba-2fda2d05adf5 Apr 21 14:07:27 user nova-compute[71474]: INFO nova.compute.manager [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Took 7.07 seconds to build instance. Apr 21 14:07:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-7cc7c8df-0004-42e7-bf4d-a9ffceb51414 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.164s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-86554e91-1f97-4d0a-8ba1-1d8089d54eda tempest-ServerBootFromVolumeStableRescueTest-28514522 tempest-ServerBootFromVolumeStableRescueTest-28514522-project-member] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.741s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:28 user nova-compute[71474]: DEBUG nova.compute.manager [req-c11b2b2d-98f1-4e8b-b605-b76385f91620 req-5a2cbe78-4b6f-4e50-aef4-a13ca1febedc service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Received event network-vif-plugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:07:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-c11b2b2d-98f1-4e8b-b605-b76385f91620 req-5a2cbe78-4b6f-4e50-aef4-a13ca1febedc service nova] Acquiring lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-c11b2b2d-98f1-4e8b-b605-b76385f91620 req-5a2cbe78-4b6f-4e50-aef4-a13ca1febedc service nova] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:28 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-c11b2b2d-98f1-4e8b-b605-b76385f91620 req-5a2cbe78-4b6f-4e50-aef4-a13ca1febedc service nova] Lock "5e502c4c-a46b-4670-acba-2fda2d05adf5-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:28 user nova-compute[71474]: DEBUG nova.compute.manager [req-c11b2b2d-98f1-4e8b-b605-b76385f91620 req-5a2cbe78-4b6f-4e50-aef4-a13ca1febedc service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] No waiting events found dispatching network-vif-plugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:07:28 user nova-compute[71474]: WARNING nova.compute.manager [req-c11b2b2d-98f1-4e8b-b605-b76385f91620 req-5a2cbe78-4b6f-4e50-aef4-a13ca1febedc service nova] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Received unexpected event network-vif-plugged-9ba354a7-6fb2-4eb1-96f4-edb58950895e for instance with vm_state deleted and task_state None. Apr 21 14:07:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:31 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:41 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:07:41 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] VM Stopped (Lifecycle Event) Apr 21 14:07:41 user nova-compute[71474]: DEBUG nova.compute.manager [None req-00a42a3a-b919-4c65-a6fd-96a1b7b18ab1 None None] [instance: 5e502c4c-a46b-4670-acba-2fda2d05adf5] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:07:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:46 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:46 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "80eb182f-948b-42d3-999b-339c5d615a73" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:53 user nova-compute[71474]: INFO nova.compute.manager [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Terminating instance Apr 21 14:07:53 user nova-compute[71474]: DEBUG nova.compute.manager [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Start destroying the instance on the hypervisor. 
{{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG nova.compute.manager [req-7bad4b8d-6a68-40af-8e81-13c26ec51d94 req-8c1d0761-a2b2-48a7-a003-86fe5ca015d7 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received event network-vif-unplugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7bad4b8d-6a68-40af-8e81-13c26ec51d94 req-8c1d0761-a2b2-48a7-a003-86fe5ca015d7 service nova] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7bad4b8d-6a68-40af-8e81-13c26ec51d94 req-8c1d0761-a2b2-48a7-a003-86fe5ca015d7 service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7bad4b8d-6a68-40af-8e81-13c26ec51d94 req-8c1d0761-a2b2-48a7-a003-86fe5ca015d7 service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG nova.compute.manager [req-7bad4b8d-6a68-40af-8e81-13c26ec51d94 req-8c1d0761-a2b2-48a7-a003-86fe5ca015d7 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] No waiting events found dispatching network-vif-unplugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG nova.compute.manager [req-7bad4b8d-6a68-40af-8e81-13c26ec51d94 req-8c1d0761-a2b2-48a7-a003-86fe5ca015d7 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received event network-vif-unplugged-def6080a-bf3f-4516-8140-08f463f69eb7 for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:53 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Instance destroyed successfully. 
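[annotation] The "No waiting events found dispatching network-vif-unplugged-..." and "Received unexpected event ..." lines reflect the compute manager's external-event bookkeeping: Neutron notifications are matched against events that an in-flight operation registered to wait for, and an event with no waiter is either noted or warned about depending on the instance's state. A deliberately simplified model of that matching, not Nova's actual implementation (which lives in nova.compute.manager.InstanceEvents):

    # Simplified model of the external-event matching described by the log
    # lines above; names and callbacks here are illustrative only.
    waiters = {}   # (instance_uuid, event_name) -> callback

    def expect_event(instance_uuid, event_name, callback):
        """Register interest before starting the operation that triggers it."""
        waiters[(instance_uuid, event_name)] = callback

    def pop_instance_event(instance_uuid, event_name):
        """Dispatch an incoming Neutron event to its waiter, if any."""
        callback = waiters.pop((instance_uuid, event_name), None)
        if callback is None:
            print(f"WARNING: unexpected event {event_name} for {instance_uuid}")
            return
        callback()

    expect_event("80eb182f", "network-vif-unplugged-def6080a",
                 lambda: print("unplug confirmed"))
    pop_instance_event("80eb182f", "network-vif-unplugged-def6080a")  # dispatched
    pop_instance_event("80eb182f", "network-vif-plugged-def6080a")    # unexpected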
Apr 21 14:07:53 user nova-compute[71474]: DEBUG nova.objects.instance [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'resources' on Instance uuid 80eb182f-948b-42d3-999b-339c5d615a73 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:03:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-718848885',display_name='tempest-ServerRescueNegativeTestJSON-server-718848885',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-718848885',id=17,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T14:05:04Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='432a123307454a44922597d6c9089447',ramdisk_id='',reservation_id='r-oiv28z2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-193683719',owner_user_name='tempest-ServerRescueNegativeTestJSON-193683719-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:05:05Z,user_data=None,user_id='4df58f0cb48f4aa29df57f9c2f632782',uuid=80eb182f-948b-42d3-999b-339c5d615a73,vcpu_model=,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converting VIF {"id": "def6080a-bf3f-4516-8140-08f463f69eb7", "address": "fa:16:3e:ff:23:c8", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdef6080a-bf", "ovs_interfaceid": "def6080a-bf3f-4516-8140-08f463f69eb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:23:c8,bridge_name='br-int',has_traffic_filtering=True,id=def6080a-bf3f-4516-8140-08f463f69eb7,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef6080a-bf') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG os_vif [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:23:c8,bridge_name='br-int',has_traffic_filtering=True,id=def6080a-bf3f-4516-8140-08f463f69eb7,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef6080a-bf') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdef6080a-bf, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:07:54 user nova-compute[71474]: INFO os_vif [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:23:c8,bridge_name='br-int',has_traffic_filtering=True,id=def6080a-bf3f-4516-8140-08f463f69eb7,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdef6080a-bf') Apr 21 14:07:54 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Deleting instance files /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73_del Apr 21 14:07:54 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Deletion of /opt/stack/data/nova/instances/80eb182f-948b-42d3-999b-339c5d615a73_del complete Apr 21 14:07:54 user nova-compute[71474]: INFO nova.compute.manager [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Took 0.72 seconds to destroy the instance on the hypervisor. Apr 21 14:07:54 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:07:54 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Took 0.47 seconds to deallocate network for instance. 
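[annotation] The DelPortCommand transaction a few entries back is ovsdbapp removing the instance's tap device from br-int as part of VIF unplug, which is what the "Successfully unplugged vif" line then confirms. A rough command-line equivalent of that single transaction, driven from Python; this is an illustration of the effect, not what os-vif itself runs (os-vif goes through ovsdbapp/OVSDB directly):

    # Rough CLI equivalent of the DelPortCommand shown above: remove a tap
    # port from br-int, tolerating the port already being gone.
    import subprocess

    subprocess.run(
        ["ovs-vsctl", "--if-exists", "del-port", "br-int", "tapdef6080a-bf"],
        check=True,
    )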
Apr 21 14:07:54 user nova-compute[71474]: DEBUG nova.compute.manager [req-0a9b56ba-b4a7-4f89-9b41-9c28e49b54d6 req-e6c30c1f-64b3-4a71-967f-20e6980667b2 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received event network-vif-deleted-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:07:54 user nova-compute[71474]: INFO nova.compute.manager [req-0a9b56ba-b4a7-4f89-9b41-9c28e49b54d6 req-e6c30c1f-64b3-4a71-967f-20e6980667b2 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Neutron deleted interface def6080a-bf3f-4516-8140-08f463f69eb7; detaching it from the instance and deleting it from the info cache Apr 21 14:07:54 user nova-compute[71474]: DEBUG nova.network.neutron [req-0a9b56ba-b4a7-4f89-9b41-9c28e49b54d6 req-e6c30c1f-64b3-4a71-967f-20e6980667b2 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG nova.compute.manager [req-0a9b56ba-b4a7-4f89-9b41-9c28e49b54d6 req-e6c30c1f-64b3-4a71-967f-20e6980667b2 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Detach interface failed, port_id=def6080a-bf3f-4516-8140-08f463f69eb7, reason: Instance 80eb182f-948b-42d3-999b-339c5d615a73 could not be found. {{(pid=71474) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:07:54 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.229s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:54 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Deleted allocations for instance 80eb182f-948b-42d3-999b-339c5d615a73 Apr 21 14:07:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-fdcc1557-6761-4bdb-adbc-f092c02927cc tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "80eb182f-948b-42d3-999b-339c5d615a73" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.624s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:55 user nova-compute[71474]: DEBUG nova.compute.manager [req-8d93d224-88fd-4bf2-9894-e7f1e70bcb2d req-c3e244c1-8f1e-4112-acb6-9b13dae48274 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received event network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:07:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8d93d224-88fd-4bf2-9894-e7f1e70bcb2d req-c3e244c1-8f1e-4112-acb6-9b13dae48274 service nova] Acquiring lock "80eb182f-948b-42d3-999b-339c5d615a73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8d93d224-88fd-4bf2-9894-e7f1e70bcb2d req-c3e244c1-8f1e-4112-acb6-9b13dae48274 service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:55 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-8d93d224-88fd-4bf2-9894-e7f1e70bcb2d req-c3e244c1-8f1e-4112-acb6-9b13dae48274 service nova] Lock "80eb182f-948b-42d3-999b-339c5d615a73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:55 user nova-compute[71474]: DEBUG nova.compute.manager [req-8d93d224-88fd-4bf2-9894-e7f1e70bcb2d req-c3e244c1-8f1e-4112-acb6-9b13dae48274 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] No waiting events found dispatching network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:07:55 user nova-compute[71474]: WARNING nova.compute.manager [req-8d93d224-88fd-4bf2-9894-e7f1e70bcb2d req-c3e244c1-8f1e-4112-acb6-9b13dae48274 service nova] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Received unexpected event 
network-vif-plugged-def6080a-bf3f-4516-8140-08f463f69eb7 for instance with vm_state deleted and task_state None. Apr 21 14:07:56 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:07:57 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:07:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:57 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:07:57 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:07:57 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk 
--force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None 
req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:58 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:07:59 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:07:59 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
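[annotation] The disk audit in the update_available_resource periodic task runs qemu-img info through oslo.concurrency's prlimit wrapper, so a runaway qemu-img is capped at 1 GiB of address space (--as=1073741824) and 30 s of CPU time (--cpu=30). A sketch that runs the same command line as the log and reads the JSON it returns; the disk path is one of the instance disks above, and qemu-img plus oslo.concurrency must be installed for it to run:

    # Run the same capped qemu-img invocation the resource tracker logs
    # above, then read the JSON it prints (format, virtual and actual size).
    import json
    import subprocess

    disk = "/opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk"
    cmd = [
        "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
        "--as=1073741824", "--cpu=30", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", disk, "--force-share", "--output=json",
    ]

    info = json.loads(subprocess.check_output(cmd))
    print(info["format"], info["virtual-size"], info.get("actual-size"))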
Apr 21 14:07:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8552MB free_disk=26.04541778564453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance a205a2a4-c0de-4c5c-abc4-7b034070e014 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance b5e2e065-1b7d-4cbf-b31a-923ae2f92fff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance eb793e62-10c7-4bc3-834b-4a046bd33462 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:07:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.322s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:00 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:08:00 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:08:00 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:08:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-a205a2a4-c0de-4c5c-abc4-7b034070e014" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:08:00 
user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-a205a2a4-c0de-4c5c-abc4-7b034070e014" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:08:00 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:08:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Updating instance_info_cache with network_info: [{"id": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "address": "fa:16:3e:ae:cb:ca", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap10363ff5-34", "ovs_interfaceid": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:08:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-a205a2a4-c0de-4c5c-abc4-7b034070e014" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:08:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:08:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:08:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:08:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:08:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:08:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:08:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:08:05 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:08:08 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:08:08 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] VM Stopped (Lifecycle Event) Apr 21 14:08:09 user nova-compute[71474]: DEBUG nova.compute.manager [None req-49dc86b8-9cf2-489a-a466-0645b1bf2fba None None] [instance: 80eb182f-948b-42d3-999b-339c5d615a73] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:08:09 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:22 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:29 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:34 user nova-compute[71474]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:35 user nova-compute[71474]: DEBUG nova.compute.manager [req-2b7d65fa-67d8-42d8-80ac-c06009afdf48 req-293defd7-c5f1-4bb0-a35e-98ba53d84ec8 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Received event network-changed-0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:08:35 user nova-compute[71474]: DEBUG nova.compute.manager [req-2b7d65fa-67d8-42d8-80ac-c06009afdf48 req-293defd7-c5f1-4bb0-a35e-98ba53d84ec8 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Refreshing instance network info cache due to event network-changed-0b9909b1-cbc2-4a32-9744-599b789730dc. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:08:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-2b7d65fa-67d8-42d8-80ac-c06009afdf48 req-293defd7-c5f1-4bb0-a35e-98ba53d84ec8 service nova] Acquiring lock "refresh_cache-91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:08:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-2b7d65fa-67d8-42d8-80ac-c06009afdf48 req-293defd7-c5f1-4bb0-a35e-98ba53d84ec8 service nova] Acquired lock "refresh_cache-91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:08:35 user nova-compute[71474]: DEBUG nova.network.neutron [req-2b7d65fa-67d8-42d8-80ac-c06009afdf48 req-293defd7-c5f1-4bb0-a35e-98ba53d84ec8 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Refreshing network info cache for port 0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:08:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:36 user nova-compute[71474]: DEBUG nova.network.neutron [req-2b7d65fa-67d8-42d8-80ac-c06009afdf48 req-293defd7-c5f1-4bb0-a35e-98ba53d84ec8 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Updated VIF entry in instance network info cache for port 0b9909b1-cbc2-4a32-9744-599b789730dc. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:08:36 user nova-compute[71474]: DEBUG nova.network.neutron [req-2b7d65fa-67d8-42d8-80ac-c06009afdf48 req-293defd7-c5f1-4bb0-a35e-98ba53d84ec8 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Updating instance_info_cache with network_info: [{"id": "0b9909b1-cbc2-4a32-9744-599b789730dc", "address": "fa:16:3e:47:d6:29", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b9909b1-cb", "ovs_interfaceid": "0b9909b1-cbc2-4a32-9744-599b789730dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:08:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-2b7d65fa-67d8-42d8-80ac-c06009afdf48 req-293defd7-c5f1-4bb0-a35e-98ba53d84ec8 service nova] Releasing lock "refresh_cache-91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:37 user nova-compute[71474]: INFO nova.compute.manager [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Terminating instance Apr 21 14:08:37 user nova-compute[71474]: DEBUG nova.compute.manager [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG nova.compute.manager [req-f2267aa3-5b39-43e0-872c-20ae56a7a345 req-5815018d-a7d6-4a45-957c-e46f1e8c1bca service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Received event network-vif-unplugged-0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f2267aa3-5b39-43e0-872c-20ae56a7a345 req-5815018d-a7d6-4a45-957c-e46f1e8c1bca service nova] Acquiring lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f2267aa3-5b39-43e0-872c-20ae56a7a345 req-5815018d-a7d6-4a45-957c-e46f1e8c1bca service nova] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.003s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-f2267aa3-5b39-43e0-872c-20ae56a7a345 req-5815018d-a7d6-4a45-957c-e46f1e8c1bca service nova] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:37 user 
nova-compute[71474]: DEBUG nova.compute.manager [req-f2267aa3-5b39-43e0-872c-20ae56a7a345 req-5815018d-a7d6-4a45-957c-e46f1e8c1bca service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] No waiting events found dispatching network-vif-unplugged-0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG nova.compute.manager [req-f2267aa3-5b39-43e0-872c-20ae56a7a345 req-5815018d-a7d6-4a45-957c-e46f1e8c1bca service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Received event network-vif-unplugged-0b9909b1-cbc2-4a32-9744-599b789730dc for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:08:37 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Instance destroyed successfully. Apr 21 14:08:37 user nova-compute[71474]: DEBUG nova.objects.instance [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lazy-loading 'resources' on Instance uuid 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:06:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1414365371',display_name='tempest-AttachVolumeNegativeTest-server-1414365371',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1414365371',id=22,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHVTbSwwxECYrofWLzM3xM2athtWkhHO4PmnRUvV4IeHkFrsz3GVwS5pKQyGAUvFsHgrVRcBmNgHjdWVkJa8/B3vkVSYjn5BwhRB1DM72Kz9Nxe+lrLjXM+s4ubHvbbIIg==',key_name='tempest-keypair-571092211',keypairs=,launch_index=0,launched_at=2023-04-21T14:06:51Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f0ccc2c950364fcbb0f2b1cc937f6a82',ramdisk_id='',reservation_id='r-ji6g5x0c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-166063504',owner_user_name='tempest-AttachVolumeNegativeTest-166063504-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:06:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ab1d2ed7df2f4a9bbf14da7e2c5fece2',uuid=91696ea3-6e52-4506-ba4d-7f87f7b9f5b1,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b9909b1-cbc2-4a32-9744-599b789730dc", "address": "fa:16:3e:47:d6:29", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b9909b1-cb", "ovs_interfaceid": "0b9909b1-cbc2-4a32-9744-599b789730dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converting VIF {"id": "0b9909b1-cbc2-4a32-9744-599b789730dc", "address": "fa:16:3e:47:d6:29", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b9909b1-cb", "ovs_interfaceid": "0b9909b1-cbc2-4a32-9744-599b789730dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:47:d6:29,bridge_name='br-int',has_traffic_filtering=True,id=0b9909b1-cbc2-4a32-9744-599b789730dc,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b9909b1-cb') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG os_vif [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:d6:29,bridge_name='br-int',has_traffic_filtering=True,id=0b9909b1-cbc2-4a32-9744-599b789730dc,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b9909b1-cb') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b9909b1-cb, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:08:37 user nova-compute[71474]: INFO os_vif [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:47:d6:29,bridge_name='br-int',has_traffic_filtering=True,id=0b9909b1-cbc2-4a32-9744-599b789730dc,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b9909b1-cb') Apr 21 14:08:37 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 
tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Deleting instance files /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1_del Apr 21 14:08:37 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Deletion of /opt/stack/data/nova/instances/91696ea3-6e52-4506-ba4d-7f87f7b9f5b1_del complete Apr 21 14:08:37 user nova-compute[71474]: INFO nova.compute.manager [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 21 14:08:37 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:08:37 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:08:39 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:08:39 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Took 1.27 seconds to deallocate network for instance. 
Apr 21 14:08:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:08:39 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:08:39 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:08:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.218s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:39 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Deleted allocations for instance 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1 Apr 21 14:08:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-4e6a6b0d-bd5f-4c8c-8387-d096beab8f20 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.522s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:39 user nova-compute[71474]: DEBUG nova.compute.manager [req-5c2746cd-a667-41de-9f42-cd1d9b593f37 req-a94d3b04-bab9-49d8-ba14-7cb25bb75651 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Received event network-vif-plugged-0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:08:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5c2746cd-a667-41de-9f42-cd1d9b593f37 req-a94d3b04-bab9-49d8-ba14-7cb25bb75651 service nova] Acquiring lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5c2746cd-a667-41de-9f42-cd1d9b593f37 req-a94d3b04-bab9-49d8-ba14-7cb25bb75651 service nova] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:08:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-5c2746cd-a667-41de-9f42-cd1d9b593f37 req-a94d3b04-bab9-49d8-ba14-7cb25bb75651 service nova] Lock "91696ea3-6e52-4506-ba4d-7f87f7b9f5b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:39 user nova-compute[71474]: DEBUG nova.compute.manager [req-5c2746cd-a667-41de-9f42-cd1d9b593f37 req-a94d3b04-bab9-49d8-ba14-7cb25bb75651 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] No waiting events found dispatching network-vif-plugged-0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:08:39 user nova-compute[71474]: WARNING nova.compute.manager [req-5c2746cd-a667-41de-9f42-cd1d9b593f37 req-a94d3b04-bab9-49d8-ba14-7cb25bb75651 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Received unexpected event network-vif-plugged-0b9909b1-cbc2-4a32-9744-599b789730dc for instance with vm_state deleted and task_state None. 
Apr 21 14:08:39 user nova-compute[71474]: DEBUG nova.compute.manager [req-5c2746cd-a667-41de-9f42-cd1d9b593f37 req-a94d3b04-bab9-49d8-ba14-7cb25bb75651 service nova] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Received event network-vif-deleted-0b9909b1-cbc2-4a32-9744-599b789730dc {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:08:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:42 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:43 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "a205a2a4-c0de-4c5c-abc4-7b034070e014" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:43 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:08:43 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:43 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:08:43 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:43 user nova-compute[71474]: INFO nova.compute.manager [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Terminating instance Apr 21 14:08:43 user nova-compute[71474]: DEBUG nova.compute.manager [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 
tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG nova.compute.manager [req-2ec1a305-8af9-40cc-95e3-4f0984104e2c req-23f8b0eb-e97d-438c-a43d-aeba06693c1a service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Received event network-vif-unplugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-2ec1a305-8af9-40cc-95e3-4f0984104e2c req-23f8b0eb-e97d-438c-a43d-aeba06693c1a service nova] Acquiring lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-2ec1a305-8af9-40cc-95e3-4f0984104e2c req-23f8b0eb-e97d-438c-a43d-aeba06693c1a service nova] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-2ec1a305-8af9-40cc-95e3-4f0984104e2c req-23f8b0eb-e97d-438c-a43d-aeba06693c1a service nova] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG nova.compute.manager [req-2ec1a305-8af9-40cc-95e3-4f0984104e2c req-23f8b0eb-e97d-438c-a43d-aeba06693c1a service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] No waiting events found dispatching network-vif-unplugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG nova.compute.manager [req-2ec1a305-8af9-40cc-95e3-4f0984104e2c req-23f8b0eb-e97d-438c-a43d-aeba06693c1a service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Received event network-vif-unplugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:08:44 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Instance destroyed successfully. 
Apr 21 14:08:44 user nova-compute[71474]: DEBUG nova.objects.instance [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lazy-loading 'resources' on Instance uuid a205a2a4-c0de-4c5c-abc4-7b034070e014 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:03:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1976108837',display_name='tempest-ServerRescueNegativeTestJSON-server-1976108837',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1976108837',id=16,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T14:03:16Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='432a123307454a44922597d6c9089447',ramdisk_id='',reservation_id='r-bqqswuy3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-193683719',owner_user_name='tempest-ServerRescueNegativeTestJSON-193683719-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:03:16Z,user_data=None,user_id='4df58f0cb48f4aa29df57f9c2f632782',uuid=a205a2a4-c0de-4c5c-abc4-7b034070e014,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "address": "fa:16:3e:ae:cb:ca", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": 
true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap10363ff5-34", "ovs_interfaceid": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converting VIF {"id": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "address": "fa:16:3e:ae:cb:ca", "network": {"id": "6942adb6-1e24-4361-9a43-e8b692767b1f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1408422178-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "432a123307454a44922597d6c9089447", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap10363ff5-34", "ovs_interfaceid": "10363ff5-34d7-4af3-bd72-c7cb78d665c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ae:cb:ca,bridge_name='br-int',has_traffic_filtering=True,id=10363ff5-34d7-4af3-bd72-c7cb78d665c9,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10363ff5-34') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG os_vif [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:cb:ca,bridge_name='br-int',has_traffic_filtering=True,id=10363ff5-34d7-4af3-bd72-c7cb78d665c9,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10363ff5-34') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap10363ff5-34, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:08:44 user nova-compute[71474]: INFO os_vif [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:cb:ca,bridge_name='br-int',has_traffic_filtering=True,id=10363ff5-34d7-4af3-bd72-c7cb78d665c9,network=Network(6942adb6-1e24-4361-9a43-e8b692767b1f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10363ff5-34') Apr 21 14:08:44 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Deleting instance files /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014_del Apr 21 14:08:44 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Deletion of /opt/stack/data/nova/instances/a205a2a4-c0de-4c5c-abc4-7b034070e014_del complete Apr 21 14:08:44 user nova-compute[71474]: INFO nova.compute.manager [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Took 0.64 seconds to destroy the instance on the hypervisor. Apr 21 14:08:44 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:08:44 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:08:45 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:08:45 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Took 0.48 seconds to deallocate network for instance. 
Apr 21 14:08:45 user nova-compute[71474]: DEBUG nova.compute.manager [req-7f4d7511-269e-463d-a5cf-b7dc8f5befa8 req-b755f04b-d269-496d-a306-dc524e621021 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Received event network-vif-deleted-10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:08:45 user nova-compute[71474]: INFO nova.compute.manager [req-7f4d7511-269e-463d-a5cf-b7dc8f5befa8 req-b755f04b-d269-496d-a306-dc524e621021 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Neutron deleted interface 10363ff5-34d7-4af3-bd72-c7cb78d665c9; detaching it from the instance and deleting it from the info cache Apr 21 14:08:45 user nova-compute[71474]: DEBUG nova.network.neutron [req-7f4d7511-269e-463d-a5cf-b7dc8f5befa8 req-b755f04b-d269-496d-a306-dc524e621021 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:08:45 user nova-compute[71474]: DEBUG nova.compute.manager [req-7f4d7511-269e-463d-a5cf-b7dc8f5befa8 req-b755f04b-d269-496d-a306-dc524e621021 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Detach interface failed, port_id=10363ff5-34d7-4af3-bd72-c7cb78d665c9, reason: Instance a205a2a4-c0de-4c5c-abc4-7b034070e014 could not be found. {{(pid=71474) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 14:08:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:08:45 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:08:45 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:08:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.190s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:45 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Deleted allocations for instance a205a2a4-c0de-4c5c-abc4-7b034070e014 Apr 21 14:08:45 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-8924b070-ed80-4e2a-8ad3-f8d23e021274 tempest-ServerRescueNegativeTestJSON-193683719 tempest-ServerRescueNegativeTestJSON-193683719-project-member] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.589s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:46 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-dc6070ee-fcf7-4df7-bf15-9857a5300c17 req-4e5c8397-f450-4af6-83a9-8236dc0549e4 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Received event network-vif-plugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:08:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-dc6070ee-fcf7-4df7-bf15-9857a5300c17 req-4e5c8397-f450-4af6-83a9-8236dc0549e4 service nova] Acquiring lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-dc6070ee-fcf7-4df7-bf15-9857a5300c17 req-4e5c8397-f450-4af6-83a9-8236dc0549e4 service nova] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:08:46 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-dc6070ee-fcf7-4df7-bf15-9857a5300c17 req-4e5c8397-f450-4af6-83a9-8236dc0549e4 service nova] Lock "a205a2a4-c0de-4c5c-abc4-7b034070e014-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:46 user nova-compute[71474]: DEBUG nova.compute.manager [req-dc6070ee-fcf7-4df7-bf15-9857a5300c17 req-4e5c8397-f450-4af6-83a9-8236dc0549e4 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] No waiting events found dispatching network-vif-plugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:08:46 user nova-compute[71474]: WARNING nova.compute.manager 
[req-dc6070ee-fcf7-4df7-bf15-9857a5300c17 req-4e5c8397-f450-4af6-83a9-8236dc0549e4 service nova] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Received unexpected event network-vif-plugged-10363ff5-34d7-4af3-bd72-c7cb78d665c9 for instance with vm_state deleted and task_state None. Apr 21 14:08:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:52 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:08:52 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] VM Stopped (Lifecycle Event) Apr 21 14:08:52 user nova-compute[71474]: DEBUG nova.compute.manager [None req-df225aa0-07a5-46a1-bc96-5610436ab5f2 None None] [instance: 91696ea3-6e52-4506-ba4d-7f87f7b9f5b1] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:08:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:55 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:08:58 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:08:58 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:08:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:08:58 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:08:58 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:08:58 user 
nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:08:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:08:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:08:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:08:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:08:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:08:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:08:59 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:08:59 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 
a205a2a4-c0de-4c5c-abc4-7b034070e014] VM Stopped (Lifecycle Event) Apr 21 14:08:59 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:08:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:08:59 user nova-compute[71474]: DEBUG nova.compute.manager [None req-c6fb83ed-e2b3-4228-9055-63584c2d8ad1 None None] [instance: a205a2a4-c0de-4c5c-abc4-7b034070e014] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:08:59 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:08:59 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:08:59 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8851MB free_disk=26.073467254638672GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:08:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:08:59 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance b5e2e065-1b7d-4cbf-b31a-923ae2f92fff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance eb793e62-10c7-4bc3-834b-4a046bd33462 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Refreshing inventories for resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Updating ProviderTree inventory for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Updating inventory in ProviderTree for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Refreshing aggregate associations for resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7, aggregates: None {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Refreshing trait associations for resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7, traits: 
COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:09:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.478s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:09:01 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:09:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None 
req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:09:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:09:01 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:09:02 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updating instance_info_cache with network_info: [{"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.101", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:09:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:09:02 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:09:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:09:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:09:03 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] 
Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:09:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:09:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:09:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Cleaning up deleted instances {{(pid=71474) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 21 14:09:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] There are 0 instances to clean {{(pid=71474) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 21 14:09:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:09:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Cleaning up deleted instances with incomplete migration {{(pid=71474) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 21 14:09:05 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:09:06 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:09:06 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:09:09 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "eb793e62-10c7-4bc3-834b-4a046bd33462" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:12 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:12 user nova-compute[71474]: INFO nova.compute.manager [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Terminating instance Apr 21 14:09:12 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Start destroying the instance on the hypervisor. 
{{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG nova.compute.manager [req-ee0b6e80-edbc-4d81-ade0-1e274f4a42a0 req-d1764f54-7b7e-4830-8acc-6a97221bf27d service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Received event network-vif-unplugged-5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ee0b6e80-edbc-4d81-ade0-1e274f4a42a0 req-d1764f54-7b7e-4830-8acc-6a97221bf27d service nova] Acquiring lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ee0b6e80-edbc-4d81-ade0-1e274f4a42a0 req-d1764f54-7b7e-4830-8acc-6a97221bf27d service nova] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-ee0b6e80-edbc-4d81-ade0-1e274f4a42a0 req-d1764f54-7b7e-4830-8acc-6a97221bf27d service nova] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG nova.compute.manager [req-ee0b6e80-edbc-4d81-ade0-1e274f4a42a0 req-d1764f54-7b7e-4830-8acc-6a97221bf27d service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] No waiting events found dispatching network-vif-unplugged-5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG nova.compute.manager [req-ee0b6e80-edbc-4d81-ade0-1e274f4a42a0 req-d1764f54-7b7e-4830-8acc-6a97221bf27d service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Received event network-vif-unplugged-5aa6dd25-1817-44da-9879-ccebac68be61 for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:09:13 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Instance destroyed successfully. 
Apr 21 14:09:13 user nova-compute[71474]: DEBUG nova.objects.instance [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lazy-loading 'resources' on Instance uuid eb793e62-10c7-4bc3-834b-4a046bd33462 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:07:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-2134957043',display_name='tempest-TestMinimumBasicScenario-server-2134957043',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-2134957043',id=23,image_ref='6d7eb54a-b068-4162-bd98-5a21649fbd2b',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCO55/5HhCQZkcFVfgVGrulI3atlpFKQjKnI78/kM4uimAx0bJQlbEfcxM2XQeW2c86vQX6nUo1+E0FJJHQNMap4lIvFfRYGCEovcOc6baFnAovI9vNKGBK33RQ9htMU1w==',key_name='tempest-TestMinimumBasicScenario-788075712',keypairs=<?>,launch_index=0,launched_at=2023-04-21T14:07:27Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cfa1f4e6f7864477b911420ea2ecb982',ramdisk_id='',reservation_id='r-bcbw0udz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6d7eb54a-b068-4162-bd98-5a21649fbd2b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-515927679',owner_user_name='tempest-TestMinimumBasicScenario-515927679-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2023-04-21T14:07:28Z,user_data=None,user_id='9d40cdc3312b43d286d8a79cde9f5418',uuid=eb793e62-10c7-4bc3-834b-4a046bd33462,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5aa6dd25-1817-44da-9879-ccebac68be61", "address": "fa:16:3e:03:9a:93", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa6dd25-18", "ovs_interfaceid": "5aa6dd25-1817-44da-9879-ccebac68be61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Converting VIF {"id": "5aa6dd25-1817-44da-9879-ccebac68be61", "address": "fa:16:3e:03:9a:93", "network": {"id": "12b23d1e-f3a6-4c34-989d-1c89ee946e24", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1972551477-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cfa1f4e6f7864477b911420ea2ecb982", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5aa6dd25-18", "ovs_interfaceid": "5aa6dd25-1817-44da-9879-ccebac68be61", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:9a:93,bridge_name='br-int',has_traffic_filtering=True,id=5aa6dd25-1817-44da-9879-ccebac68be61,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa6dd25-18') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG os_vif [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:9a:93,bridge_name='br-int',has_traffic_filtering=True,id=5aa6dd25-1817-44da-9879-ccebac68be61,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa6dd25-18') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5aa6dd25-18, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:09:13 user nova-compute[71474]: INFO os_vif [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:9a:93,bridge_name='br-int',has_traffic_filtering=True,id=5aa6dd25-1817-44da-9879-ccebac68be61,network=Network(12b23d1e-f3a6-4c34-989d-1c89ee946e24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5aa6dd25-18') Apr 21 14:09:13 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Deleting instance files /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462_del Apr 21 14:09:13 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Deletion of /opt/stack/data/nova/instances/eb793e62-10c7-4bc3-834b-4a046bd33462_del complete Apr 21 14:09:13 user nova-compute[71474]: INFO nova.compute.manager [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Took 0.63 seconds to destroy the instance on the hypervisor. Apr 21 14:09:13 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:09:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:14 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:09:14 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Took 0.48 seconds to deallocate network for instance. 
Apr 21 14:09:14 user nova-compute[71474]: DEBUG nova.compute.manager [req-88d422c3-8917-4025-a633-8b4508a92b2d req-0f2f1a73-2355-44f0-9ff7-ba7650c9c49f service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Received event network-vif-deleted-5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:09:14 user nova-compute[71474]: INFO nova.compute.manager [req-88d422c3-8917-4025-a633-8b4508a92b2d req-0f2f1a73-2355-44f0-9ff7-ba7650c9c49f service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Neutron deleted interface 5aa6dd25-1817-44da-9879-ccebac68be61; detaching it from the instance and deleting it from the info cache Apr 21 14:09:14 user nova-compute[71474]: DEBUG nova.network.neutron [req-88d422c3-8917-4025-a633-8b4508a92b2d req-0f2f1a73-2355-44f0-9ff7-ba7650c9c49f service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:09:14 user nova-compute[71474]: DEBUG nova.compute.manager [req-88d422c3-8917-4025-a633-8b4508a92b2d req-0f2f1a73-2355-44f0-9ff7-ba7650c9c49f service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Detach interface failed, port_id=5aa6dd25-1817-44da-9879-ccebac68be61, reason: Instance eb793e62-10c7-4bc3-834b-4a046bd33462 could not be found. {{(pid=71474) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 14:09:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:14 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:09:14 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:09:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:14 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Deleted allocations for instance eb793e62-10c7-4bc3-834b-4a046bd33462 Apr 21 14:09:14 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b0e2096f-8790-45c5-a8c7-1c2902592676 tempest-TestMinimumBasicScenario-515927679 tempest-TestMinimumBasicScenario-515927679-project-member] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.402s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:15 user nova-compute[71474]: DEBUG nova.compute.manager [req-e80d0b45-29bd-4377-b1b0-c5510bc7b410 req-c69f362e-5768-4418-a6fe-a9e60cc52890 service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Received event network-vif-plugged-5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:09:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e80d0b45-29bd-4377-b1b0-c5510bc7b410 req-c69f362e-5768-4418-a6fe-a9e60cc52890 service nova] Acquiring lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e80d0b45-29bd-4377-b1b0-c5510bc7b410 req-c69f362e-5768-4418-a6fe-a9e60cc52890 service nova] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:15 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-e80d0b45-29bd-4377-b1b0-c5510bc7b410 req-c69f362e-5768-4418-a6fe-a9e60cc52890 service nova] Lock "eb793e62-10c7-4bc3-834b-4a046bd33462-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:15 user nova-compute[71474]: DEBUG nova.compute.manager [req-e80d0b45-29bd-4377-b1b0-c5510bc7b410 req-c69f362e-5768-4418-a6fe-a9e60cc52890 service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] No waiting events found dispatching network-vif-plugged-5aa6dd25-1817-44da-9879-ccebac68be61 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:09:15 user nova-compute[71474]: WARNING nova.compute.manager [req-e80d0b45-29bd-4377-b1b0-c5510bc7b410 req-c69f362e-5768-4418-a6fe-a9e60cc52890 service nova] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Received unexpected event network-vif-plugged-5aa6dd25-1817-44da-9879-ccebac68be61 
for instance with vm_state deleted and task_state None. Apr 21 14:09:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:09:28 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:09:28 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] VM Stopped (Lifecycle Event) Apr 21 14:09:28 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b5c07be5-cdfe-4c08-a447-7f7b0c6275af None None] [instance: eb793e62-10c7-4bc3-834b-4a046bd33462] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:09:28 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:09:28 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:28 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:09:28 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:09:28 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:09:28 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:31 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Starting instance... 
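
The "_locked_do_build_and_run_instance" acquire/release lines are oslo.concurrency's named-lock pattern: build and terminate paths serialize on the instance UUID so they cannot interleave on the same instance. A small sketch of that pattern, assuming illustrative function names rather than nova's own:

    # Sketch of the named-lock pattern visible in the lockutils lines:
    # both build and terminate grab a semaphore keyed on the instance UUID.
    from oslo_concurrency import lockutils

    synchronized = lockutils.synchronized_with_prefix('nova-')


    def build_instance(instance_uuid):
        @synchronized(instance_uuid)
        def _locked_build():
            print("building", instance_uuid)     # would spawn the guest here
        _locked_build()


    def terminate_instance(instance_uuid):
        @synchronized(instance_uuid)
        def _locked_terminate():
            print("terminating", instance_uuid)  # would destroy the guest here
        _locked_terminate()


    build_instance("4a52be06-ff23-47fa-8f3f-ecd2e4045df0")
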
{{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:09:32 user nova-compute[71474]: INFO nova.compute.claims [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Claim successful on node user Apr 21 14:09:32 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Start building networks asynchronously for instance. 
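
The inventory payload repeated in these entries is what placement uses to size the host: usable capacity per resource class is (total - reserved) * allocation_ratio, so the figures above come out to 48 VCPU, 15511 MB of RAM and 40 GB of disk. A worked check of that arithmetic:

    # Worked check of the capacity placement derives from the inventory
    # dict shown in the log: capacity = (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: {capacity:g}")
    # VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40
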
{{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:09:32 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:09:32 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG nova.policy [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ab1d2ed7df2f4a9bbf14da7e2c5fece2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0ccc2c950364fcbb0f2b1cc937f6a82', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Start spawning the instance on the hypervisor. 
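
The nova.policy entry records a failed check of network:attach_external_network for the tempest user, which only carries the member and reader roles; the failure is expected and nova simply does not request an external network. A hedged sketch of what such a check looks like with oslo.policy; the registered default rule below is an assumption chosen to mirror an admin-only policy, not nova's shipped defaults.

    # Sketch of an oslo.policy check similar to the failed
    # "network:attach_external_network" authorization in the log.
    from oslo_config import cfg
    from oslo_policy import policy

    cfg.CONF([], project='example')        # minimal config so the enforcer can load
    enforcer = policy.Enforcer(cfg.CONF)
    enforcer.register_default(
        policy.RuleDefault('network:attach_external_network', 'role:admin',
                           description='Attach an external (public) network'))

    creds = {'roles': ['reader', 'member'],
             'user_id': 'ab1d2ed7df2f4a9bbf14da7e2c5fece2',
             'project_id': 'f0ccc2c950364fcbb0f2b1cc937f6a82'}

    # Returns False for a member/reader token, as in the log entry above.
    print(enforcer.enforce('network:attach_external_network', {}, creds))
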
{{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:09:32 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Creating image(s) Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "/opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "/opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "/opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None 
req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.176s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk 1073741824" returned: 0 in 0.045s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.229s 
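
The two processutils commands above are the core of the Qcow2 image backend: probe the cached base image under a prlimit sandbox, then create a per-instance qcow2 overlay whose backing file is that base. A sketch of the same pair of calls with oslo.concurrency, reusing the paths from the log; real nova goes through its imagebackend/privsep layers rather than calling these directly.

    # Sketch: the qemu-img pair seen in the log, run with the same
    # 1 GiB address-space / 30 s CPU caps via oslo_concurrency.prlimit.
    import json
    from oslo_concurrency import processutils

    BASE = '/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7'
    DISK = '/opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk'

    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)

    # 1. Probe the base image (the "qemu-img info --output=json" call).
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', BASE, '--force-share', '--output=json',
        prlimit=limits)
    print(json.loads(out)['virtual-size'])

    # 2. Create the per-instance qcow2 overlay on top of the raw base image.
    processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % BASE,
        DISK, '1073741824')
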
{{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.127s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Checking if we can resize image /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Cannot resize image /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk to a smaller size. 
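
The "Cannot resize image ... to a smaller size" line is the expected outcome here: the flavor's root disk is 1 GiB and the overlay was just created with a 1 GiB virtual size, so there is nothing to grow. The decision boils down to comparing the requested size with the virtual-size qemu-img reports; a minimal sketch, assuming qemu-img is on PATH:

    # Sketch of the can_resize_image decision: only grow, never shrink.
    import json
    import subprocess


    def virtual_size(path):
        """Virtual size in bytes as reported by qemu-img info."""
        out = subprocess.check_output(
            ['qemu-img', 'info', path, '--force-share', '--output=json'])
        return json.loads(out)['virtual-size']


    def can_resize_image(path, requested_bytes):
        # Growing is allowed; an equal or smaller request means "do nothing".
        return requested_bytes > virtual_size(path)


    disk = '/opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk'
    print(can_resize_image(disk, 1073741824))   # False here: already 1 GiB
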
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG nova.objects.instance [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lazy-loading 'migration_context' on Instance uuid 4a52be06-ff23-47fa-8f3f-ecd2e4045df0 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Ensure instance console log exists: /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Successfully created port: 2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:09:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Successfully updated port: 2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) _update_port 
/opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquired lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.compute.manager [req-7126fc34-586f-4c40-ac58-decbeee3c2a4 req-deb743fe-fa39-4688-a14b-8e93c34b84ed service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Received event network-changed-2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.compute.manager [req-7126fc34-586f-4c40-ac58-decbeee3c2a4 req-deb743fe-fa39-4688-a14b-8e93c34b84ed service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Refreshing instance network info cache due to event network-changed-2ae823dd-90e2-45a4-a300-3ea150d56569. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7126fc34-586f-4c40-ac58-decbeee3c2a4 req-deb743fe-fa39-4688-a14b-8e93c34b84ed service nova] Acquiring lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Instance cache missing network info. 
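
The instance_info_cache payload logged in the entries just below is plain JSON-like data: one element per VIF carrying the Neutron port id, MAC, the bound bridge and the fixed IPs nested under network.subnets. A small sketch that pulls out the fields os-vif and the guest XML builder care about; the dict literal is trimmed from the cache update that follows.

    # Sketch: picking the useful bits out of one network_info entry
    # (structure copied, trimmed, from the cache updates logged below).
    vif = {
        "id": "2ae823dd-90e2-45a4-a300-3ea150d56569",
        "address": "fa:16:3e:4d:ce:3c",
        "network": {
            "id": "31b07b9f-0a0f-426a-97d6-12b23e611818",
            "bridge": "br-int",
            "subnets": [{
                "cidr": "10.0.0.0/28",
                "ips": [{"address": "10.0.0.12", "type": "fixed"}],
            }],
            "meta": {"mtu": 1442},
        },
        "devname": "tap2ae823dd-90",
        "type": "ovs",
        "details": {"port_filter": True, "connectivity": "l2"},
    }

    fixed_ips = [ip["address"]
                 for subnet in vif["network"]["subnets"]
                 for ip in subnet["ips"] if ip["type"] == "fixed"]

    print(vif["devname"], vif["address"], vif["network"]["bridge"], fixed_ips)
    # tap2ae823dd-90 fa:16:3e:4d:ce:3c br-int ['10.0.0.12']
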
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.network.neutron [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Updating instance_info_cache with network_info: [{"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Releasing lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Instance network_info: |[{"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7126fc34-586f-4c40-ac58-decbeee3c2a4 req-deb743fe-fa39-4688-a14b-8e93c34b84ed service nova] Acquired lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.network.neutron [req-7126fc34-586f-4c40-ac58-decbeee3c2a4 req-deb743fe-fa39-4688-a14b-8e93c34b84ed service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Refreshing network info cache for port 2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Start _get_guest_xml network_info=[{"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:09:34 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:09:34 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
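
The hardware.py lines above walk the standard topology selection: neither the m1.nano flavor nor the image expresses a preference (limits and preferences are all 0:0:0, so the caps fall back to 65536), and for a single vCPU the only candidate is 1 socket x 1 core x 1 thread, which is what the entries that follow report. A toy version of that enumeration, under the simplifying assumption that any (sockets, cores, threads) triple whose product equals the vCPU count and fits the caps is a candidate:

    # Toy enumeration of CPU topologies for a given vCPU count.
    import itertools


    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        for sockets, cores, threads in itertools.product(
                range(1, min(vcpus, max_sockets) + 1),
                range(1, min(vcpus, max_cores) + 1),
                range(1, min(vcpus, max_threads) + 1)):
            if sockets * cores * threads == vcpus:
                yield sockets, cores, threads


    print(list(possible_topologies(1)))   # [(1, 1, 1)] -- the log's only candidate
    print(list(possible_topologies(4)))   # (1,1,4), (1,2,2), (2,2,1), (4,1,1), ...
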
{{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:09:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-911419057',display_name='tempest-AttachVolumeNegativeTest-server-911419057',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-911419057',id=24,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNh3h/ls9Vz2CSJMa9VpcyU76RLbRZmuJQ248SapA8bgNUNmhAl7IYjCs169Izl/iH7Dan2D0JBNyZec2ol4KFoZzjxYCOpbOA18fLh9nA4MO8xkGTV3gPFpF/O9TMHYqw==',key_name='tempest-keypair-1225044309',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0ccc2c950364fcbb0f2b1cc937f6a82',ramdisk_id='',reservation_id='r-kf0xvnby',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-166063504',owner_user_name='tempest-AttachVolumeNegativeTest-166063504-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:09:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ab1d2ed7df2f4a9bbf14da7e2c5fece2',uuid=4a52be06-ff23-47fa-8f3f-ecd2e4045df0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converting VIF {"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:ce:3c,bridge_name='br-int',has_traffic_filtering=True,id=2ae823dd-90e2-45a4-a300-3ea150d56569,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ae823dd-90') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.objects.instance [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lazy-loading 'pci_devices' on Instance uuid 4a52be06-ff23-47fa-8f3f-ecd2e4045df0 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] End _get_guest_xml xml= Apr 21 14:09:34 user nova-compute[71474]: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0 Apr 21 14:09:34 user nova-compute[71474]: instance-00000018 Apr 21 14:09:34 user nova-compute[71474]: 131072 Apr 21 14:09:34 user nova-compute[71474]: 1 Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: tempest-AttachVolumeNegativeTest-server-911419057 Apr 21 14:09:34 user nova-compute[71474]: 2023-04-21 14:09:34 Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: 128 Apr 21 14:09:34 user nova-compute[71474]: 1 Apr 21 14:09:34 user nova-compute[71474]: 0 Apr 21 14:09:34 user nova-compute[71474]: 0 Apr 21 14:09:34 user nova-compute[71474]: 1 Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: tempest-AttachVolumeNegativeTest-166063504-project-member Apr 21 14:09:34 user nova-compute[71474]: tempest-AttachVolumeNegativeTest-166063504 Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: OpenStack Foundation Apr 21 14:09:34 user nova-compute[71474]: OpenStack Nova Apr 21 14:09:34 user nova-compute[71474]: 0.0.0 Apr 21 14:09:34 user 
nova-compute[71474]: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0 Apr 21 14:09:34 user nova-compute[71474]: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0 Apr 21 14:09:34 user nova-compute[71474]: Virtual Machine Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: hvm Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Nehalem Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: /dev/urandom Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: Apr 21 14:09:34 user nova-compute[71474]: {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:09:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-911419057',display_name='tempest-AttachVolumeNegativeTest-server-911419057',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-911419057',id=24,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNh3h/ls9Vz2CSJMa9VpcyU76RLbRZmuJQ248SapA8bgNUNmhAl7IYjCs169Izl/iH7Dan2D0JBNyZec2ol4KFoZzjxYCOpbOA18fLh9nA4MO8xkGTV3gPFpF/O9TMHYqw==',key_name='tempest-keypair-1225044309',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f0ccc2c950364fcbb0f2b1cc937f6a82',ramdisk_id='',reservation_id='r-kf0xvnby',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-166063504',owner_user_name='tempest-AttachVolumeNegativeTest-166063504-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:09:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ab1d2ed7df2f4a9bbf14da7e2c5fece2',uuid=4a52be06-ff23-47fa-8f3f-ecd2e4045df0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converting VIF {"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:ce:3c,bridge_name='br-int',has_traffic_filtering=True,id=2ae823dd-90e2-45a4-a300-3ea150d56569,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ae823dd-90') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG os_vif [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:ce:3c,bridge_name='br-int',has_traffic_filtering=True,id=2ae823dd-90e2-45a4-a300-3ea150d56569,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ae823dd-90') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ae823dd-90, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2ae823dd-90, col_values=(('external_ids', {'iface-id': '2ae823dd-90e2-45a4-a300-3ea150d56569', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4d:ce:3c', 'vm-uuid': '4a52be06-ff23-47fa-8f3f-ecd2e4045df0'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:34 user nova-compute[71474]: INFO os_vif [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:ce:3c,bridge_name='br-int',has_traffic_filtering=True,id=2ae823dd-90e2-45a4-a300-3ea150d56569,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ae823dd-90') Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:09:34 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] No VIF found with MAC fa:16:3e:4d:ce:3c, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:09:35 user nova-compute[71474]: DEBUG nova.network.neutron [req-7126fc34-586f-4c40-ac58-decbeee3c2a4 req-deb743fe-fa39-4688-a14b-8e93c34b84ed service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Updated VIF entry in instance network info cache for port 2ae823dd-90e2-45a4-a300-3ea150d56569. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:09:35 user nova-compute[71474]: DEBUG nova.network.neutron [req-7126fc34-586f-4c40-ac58-decbeee3c2a4 req-deb743fe-fa39-4688-a14b-8e93c34b84ed service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Updating instance_info_cache with network_info: [{"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:09:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7126fc34-586f-4c40-ac58-decbeee3c2a4 req-deb743fe-fa39-4688-a14b-8e93c34b84ed service nova] Releasing lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:09:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:36 user nova-compute[71474]: DEBUG nova.compute.manager [req-82321f00-08b6-44f8-b94b-a3a3b53a3925 req-fea7597a-a650-47ab-a2b9-38ba16825999 service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Received event network-vif-plugged-2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:09:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-82321f00-08b6-44f8-b94b-a3a3b53a3925 req-fea7597a-a650-47ab-a2b9-38ba16825999 service nova] Acquiring lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:36 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-82321f00-08b6-44f8-b94b-a3a3b53a3925 req-fea7597a-a650-47ab-a2b9-38ba16825999 service nova] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:36 
user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-82321f00-08b6-44f8-b94b-a3a3b53a3925 req-fea7597a-a650-47ab-a2b9-38ba16825999 service nova] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:36 user nova-compute[71474]: DEBUG nova.compute.manager [req-82321f00-08b6-44f8-b94b-a3a3b53a3925 req-fea7597a-a650-47ab-a2b9-38ba16825999 service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] No waiting events found dispatching network-vif-plugged-2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:09:36 user nova-compute[71474]: WARNING nova.compute.manager [req-82321f00-08b6-44f8-b94b-a3a3b53a3925 req-fea7597a-a650-47ab-a2b9-38ba16825999 service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Received unexpected event network-vif-plugged-2ae823dd-90e2-45a4-a300-3ea150d56569 for instance with vm_state building and task_state spawning. Apr 21 14:09:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:37 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:09:38 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] VM Resumed (Lifecycle Event) Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:09:38 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Instance spawned successfully. 
Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 
4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:09:38 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:09:38 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] VM Started (Lifecycle Event) Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:09:38 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:09:38 user nova-compute[71474]: INFO nova.compute.manager [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Took 6.04 seconds to spawn the instance on the hypervisor. 
Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.compute.manager [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.compute.manager [req-24937e3c-8189-4867-b3d9-a9c7f1e9f6ea req-8380ef52-c49e-4d22-9422-98e25fcac1d4 service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Received event network-vif-plugged-2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-24937e3c-8189-4867-b3d9-a9c7f1e9f6ea req-8380ef52-c49e-4d22-9422-98e25fcac1d4 service nova] Acquiring lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-24937e3c-8189-4867-b3d9-a9c7f1e9f6ea req-8380ef52-c49e-4d22-9422-98e25fcac1d4 service nova] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-24937e3c-8189-4867-b3d9-a9c7f1e9f6ea req-8380ef52-c49e-4d22-9422-98e25fcac1d4 service nova] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:38 user nova-compute[71474]: DEBUG nova.compute.manager [req-24937e3c-8189-4867-b3d9-a9c7f1e9f6ea req-8380ef52-c49e-4d22-9422-98e25fcac1d4 service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] No waiting events found dispatching network-vif-plugged-2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:09:38 user nova-compute[71474]: WARNING nova.compute.manager [req-24937e3c-8189-4867-b3d9-a9c7f1e9f6ea req-8380ef52-c49e-4d22-9422-98e25fcac1d4 service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Received unexpected event network-vif-plugged-2ae823dd-90e2-45a4-a300-3ea150d56569 for instance with vm_state building and task_state spawning. Apr 21 14:09:38 user nova-compute[71474]: INFO nova.compute.manager [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Took 6.59 seconds to build instance. 
Apr 21 14:09:38 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-f53650d5-49c4-4775-86b0-98020eb97ff7 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.996s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:09:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:41 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:46 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:56 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:09:58 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:09:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:00 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:10:00 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:10:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:10:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None 
None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:10:00 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:10:00 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:10:00 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:10:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:10:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:10:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:10:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:10:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None 
req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:10:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:10:01 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:10:01 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:10:01 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 14:10:01 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8872MB free_disk=26.069454193115234GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:10:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:10:01 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:10:02 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance b5e2e065-1b7d-4cbf-b31a-923ae2f92fff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:10:02 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 4a52be06-ff23-47fa-8f3f-ecd2e4045df0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:10:02 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:10:02 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:10:02 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:10:02 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:10:02 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:10:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Rebuilding the list of instances to heal {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG nova.objects.instance [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lazy-loading 'info_cache' on Instance uuid b5e2e065-1b7d-4cbf-b31a-923ae2f92fff {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updating instance_info_cache with network_info: [{"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.101", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:10:03 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:10:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 
None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:10:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:10:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:05 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:10:06 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:08 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:08 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:10:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:10:09 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:29 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:10:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:10:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:10:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:59 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:10:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:10:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:10:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:10:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:10:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE 
{{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:10:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:00 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:11:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:11:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:11:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:11:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:11:02 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:11:02 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:11:02 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG 
oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:11:03 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:11:03 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 21 14:11:03 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8974MB free_disk=26.090839385986328GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance b5e2e065-1b7d-4cbf-b31a-923ae2f92fff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance 4a52be06-ff23-47fa-8f3f-ecd2e4045df0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:11:03 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:11:04 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:11:04 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:11:04 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:11:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:11:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:05 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:11:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:11:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:11:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] 
Acquired lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:11:05 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:11:05 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Updating instance_info_cache with network_info: [{"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:11:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:11:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:11:06 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:11:06 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:11:06 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:11:09 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:11:10 user nova-compute[71474]: DEBUG 
oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:11:10 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:11:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:22 user nova-compute[71474]: DEBUG nova.compute.manager [req-fb983fb5-56a4-4a18-8fe9-a8ec09fd1fae req-bc684b89-c159-498f-b7d6-a1c79b83e6ea service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Received event network-changed-2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:11:22 user nova-compute[71474]: DEBUG nova.compute.manager [req-fb983fb5-56a4-4a18-8fe9-a8ec09fd1fae req-bc684b89-c159-498f-b7d6-a1c79b83e6ea service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Refreshing instance network info cache due to event network-changed-2ae823dd-90e2-45a4-a300-3ea150d56569. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:11:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-fb983fb5-56a4-4a18-8fe9-a8ec09fd1fae req-bc684b89-c159-498f-b7d6-a1c79b83e6ea service nova] Acquiring lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:11:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-fb983fb5-56a4-4a18-8fe9-a8ec09fd1fae req-bc684b89-c159-498f-b7d6-a1c79b83e6ea service nova] Acquired lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:11:22 user nova-compute[71474]: DEBUG nova.network.neutron [req-fb983fb5-56a4-4a18-8fe9-a8ec09fd1fae req-bc684b89-c159-498f-b7d6-a1c79b83e6ea service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Refreshing network info cache for port 2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:11:23 user nova-compute[71474]: DEBUG nova.network.neutron [req-fb983fb5-56a4-4a18-8fe9-a8ec09fd1fae req-bc684b89-c159-498f-b7d6-a1c79b83e6ea service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Updated VIF entry in instance network info cache for port 2ae823dd-90e2-45a4-a300-3ea150d56569. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:11:23 user nova-compute[71474]: DEBUG nova.network.neutron [req-fb983fb5-56a4-4a18-8fe9-a8ec09fd1fae req-bc684b89-c159-498f-b7d6-a1c79b83e6ea service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Updating instance_info_cache with network_info: [{"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.253", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:11:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-fb983fb5-56a4-4a18-8fe9-a8ec09fd1fae req-bc684b89-c159-498f-b7d6-a1c79b83e6ea service nova] Releasing lock "refresh_cache-4a52be06-ff23-47fa-8f3f-ecd2e4045df0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:11:24 user nova-compute[71474]: INFO nova.compute.manager [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Terminating instance Apr 21 14:11:24 user nova-compute[71474]: DEBUG nova.compute.manager [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Start destroying the instance on the hypervisor. {{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:24 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Instance destroyed successfully. 
Apr 21 14:11:24 user nova-compute[71474]: DEBUG nova.objects.instance [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lazy-loading 'resources' on Instance uuid 4a52be06-ff23-47fa-8f3f-ecd2e4045df0 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:09:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-911419057',display_name='tempest-AttachVolumeNegativeTest-server-911419057',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-911419057',id=24,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNh3h/ls9Vz2CSJMa9VpcyU76RLbRZmuJQ248SapA8bgNUNmhAl7IYjCs169Izl/iH7Dan2D0JBNyZec2ol4KFoZzjxYCOpbOA18fLh9nA4MO8xkGTV3gPFpF/O9TMHYqw==',key_name='tempest-keypair-1225044309',keypairs=,launch_index=0,launched_at=2023-04-21T14:09:38Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f0ccc2c950364fcbb0f2b1cc937f6a82',ramdisk_id='',reservation_id='r-kf0xvnby',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-166063504',owner_user_name='tempest-AttachVolumeNegativeTest-166063504-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:09:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ab1d2ed7df2f4a9bbf14da7e2c5fece2',uuid=4a52be06-ff23-47fa-8f3f-ecd2e4045df0,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.253", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converting VIF {"id": "2ae823dd-90e2-45a4-a300-3ea150d56569", "address": "fa:16:3e:4d:ce:3c", "network": {"id": "31b07b9f-0a0f-426a-97d6-12b23e611818", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1809206062-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.253", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f0ccc2c950364fcbb0f2b1cc937f6a82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ae823dd-90", "ovs_interfaceid": "2ae823dd-90e2-45a4-a300-3ea150d56569", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4d:ce:3c,bridge_name='br-int',has_traffic_filtering=True,id=2ae823dd-90e2-45a4-a300-3ea150d56569,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ae823dd-90') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG os_vif [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4d:ce:3c,bridge_name='br-int',has_traffic_filtering=True,id=2ae823dd-90e2-45a4-a300-3ea150d56569,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ae823dd-90') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ae823dd-90, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:11:24 user nova-compute[71474]: INFO os_vif [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4d:ce:3c,bridge_name='br-int',has_traffic_filtering=True,id=2ae823dd-90e2-45a4-a300-3ea150d56569,network=Network(31b07b9f-0a0f-426a-97d6-12b23e611818),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ae823dd-90') Apr 21 14:11:24 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Deleting instance files /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0_del Apr 21 14:11:24 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Deletion of /opt/stack/data/nova/instances/4a52be06-ff23-47fa-8f3f-ecd2e4045df0_del complete Apr 21 14:11:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-9eff4fd3-93d1-4b20-ae7c-27f30c2a743e req-000e5176-02c8-47cb-80c0-67b2d9b225e2 service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Received event network-vif-unplugged-2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:11:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9eff4fd3-93d1-4b20-ae7c-27f30c2a743e req-000e5176-02c8-47cb-80c0-67b2d9b225e2 service nova] Acquiring lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:11:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9eff4fd3-93d1-4b20-ae7c-27f30c2a743e req-000e5176-02c8-47cb-80c0-67b2d9b225e2 service nova] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:11:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9eff4fd3-93d1-4b20-ae7c-27f30c2a743e req-000e5176-02c8-47cb-80c0-67b2d9b225e2 service nova] Lock 
"4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:11:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-9eff4fd3-93d1-4b20-ae7c-27f30c2a743e req-000e5176-02c8-47cb-80c0-67b2d9b225e2 service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] No waiting events found dispatching network-vif-unplugged-2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:11:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-9eff4fd3-93d1-4b20-ae7c-27f30c2a743e req-000e5176-02c8-47cb-80c0-67b2d9b225e2 service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Received event network-vif-unplugged-2ae823dd-90e2-45a4-a300-3ea150d56569 for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:11:25 user nova-compute[71474]: INFO nova.compute.manager [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 21 14:11:25 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:11:25 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:11:25 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:11:25 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:26 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:11:26 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Took 1.01 seconds to deallocate network for instance. 
Apr 21 14:11:26 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:11:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:11:26 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:11:26 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:11:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.158s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:11:26 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Deleted allocations for instance 4a52be06-ff23-47fa-8f3f-ecd2e4045df0 Apr 21 14:11:26 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-b33eeaa5-4d8a-48b3-821c-1df8d34b9e25 tempest-AttachVolumeNegativeTest-166063504 tempest-AttachVolumeNegativeTest-166063504-project-member] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.094s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:11:27 user nova-compute[71474]: DEBUG nova.compute.manager [req-671231d5-8d93-40a4-9b97-5e422c01a93f 
req-fc0ad880-5d69-4a26-96f8-56682085fdee service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Received event network-vif-plugged-2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:11:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-671231d5-8d93-40a4-9b97-5e422c01a93f req-fc0ad880-5d69-4a26-96f8-56682085fdee service nova] Acquiring lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:11:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-671231d5-8d93-40a4-9b97-5e422c01a93f req-fc0ad880-5d69-4a26-96f8-56682085fdee service nova] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:11:27 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-671231d5-8d93-40a4-9b97-5e422c01a93f req-fc0ad880-5d69-4a26-96f8-56682085fdee service nova] Lock "4a52be06-ff23-47fa-8f3f-ecd2e4045df0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:11:27 user nova-compute[71474]: DEBUG nova.compute.manager [req-671231d5-8d93-40a4-9b97-5e422c01a93f req-fc0ad880-5d69-4a26-96f8-56682085fdee service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] No waiting events found dispatching network-vif-plugged-2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:11:27 user nova-compute[71474]: WARNING nova.compute.manager [req-671231d5-8d93-40a4-9b97-5e422c01a93f req-fc0ad880-5d69-4a26-96f8-56682085fdee service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Received unexpected event network-vif-plugged-2ae823dd-90e2-45a4-a300-3ea150d56569 for instance with vm_state deleted and task_state None. 
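[editor note] The "No waiting events found dispatching network-vif-plugged-..." and "Received unexpected event ... vm_state deleted" lines above show the external-event handshake: compute keeps a per-instance table of expected Neutron events, and notifications that arrive after the waiter is gone (here, after the instance was already deleted) are logged and dropped. A simplified, hedged model of that pop-or-warn behaviour; nova's real implementation lives in nova.compute.manager.InstanceEvents:

    import threading

    class InstanceEvents:
        """Illustrative per-instance table of expected external events."""
        def __init__(self):
            self._events = {}            # {instance_uuid: {event_name: Event}}
            self._lock = threading.Lock()

        def prepare(self, instance_uuid, event_name):
            with self._lock:
                ev = threading.Event()
                self._events.setdefault(instance_uuid, {})[event_name] = ev
                return ev                # caller blocks on ev.wait(timeout)

        def pop(self, instance_uuid, event_name):
            with self._lock:
                ev = self._events.get(instance_uuid, {}).pop(event_name, None)
            if ev is None:
                print('No waiting events found dispatching', event_name)
            else:
                ev.set()                 # wake the waiter (e.g. during plug)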
Apr 21 14:11:27 user nova-compute[71474]: DEBUG nova.compute.manager [req-671231d5-8d93-40a4-9b97-5e422c01a93f req-fc0ad880-5d69-4a26-96f8-56682085fdee service nova] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Received event network-vif-deleted-2ae823dd-90e2-45a4-a300-3ea150d56569 {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:11:29 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:34 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:11:39 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:11:39 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] VM Stopped (Lifecycle Event) Apr 21 14:11:39 user nova-compute[71474]: DEBUG nova.compute.manager [None req-851aa8a8-8d65-41de-9fc7-713593d547d5 None None] [instance: 4a52be06-ff23-47fa-8f3f-ecd2e4045df0] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:11:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:11:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:11:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:11:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:11:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:44 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:49 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:11:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:11:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:00 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:12:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:12:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:12:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:12:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:12:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:12:04 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Rebuilding the list of instances to heal {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 14:12:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:12:04 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:12:04 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:12:04 user nova-compute[71474]: DEBUG nova.objects.instance [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lazy-loading 'info_cache' on Instance uuid b5e2e065-1b7d-4cbf-b31a-923ae2f92fff {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:12:04 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updating instance_info_cache with network_info: [{"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", 
"label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.101", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:12:05 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:12:06 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:12:06 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
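[editor note] The "Inventory has not changed for provider ..." lines in this section show the resource tracker comparing its locally computed inventory against what placement already stores; an update is only sent when the payload differs. A hedged sketch that rebuilds the same payload from the totals visible in the log (12 vCPUs at a 4.0 allocation ratio, 16023 MB RAM with 512 MB reserved, 40 GB disk); only the dictionary structure is nova's, the helper itself is illustrative:

    # Hedged sketch of the inventory dict compared against placement.
    def build_inventory(vcpus, memory_mb, disk_gb,
                        cpu_ratio=4.0, ram_ratio=1.0, disk_ratio=1.0,
                        reserved_ram_mb=512):
        return {
            'VCPU': {'total': vcpus, 'reserved': 0, 'min_unit': 1,
                     'max_unit': vcpus, 'step_size': 1,
                     'allocation_ratio': cpu_ratio},
            'MEMORY_MB': {'total': memory_mb, 'reserved': reserved_ram_mb,
                          'min_unit': 1, 'max_unit': memory_mb,
                          'step_size': 1, 'allocation_ratio': ram_ratio},
            'DISK_GB': {'total': disk_gb, 'reserved': 0, 'min_unit': 1,
                        'max_unit': disk_gb, 'step_size': 1,
                        'allocation_ratio': disk_ratio},
        }

    inv = build_inventory(12, 16023, 40)
    # If inv matches placement's copy, nova logs
    # "Inventory has not changed for provider ..." and skips the PUT.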
Apr 21 14:12:06 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=9056MB free_disk=26.109638214111328GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:12:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:12:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:12:06 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance b5e2e065-1b7d-4cbf-b31a-923ae2f92fff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:12:06 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:12:06 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:12:06 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:12:06 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:12:06 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:12:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:12:08 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:12:08 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:12:08 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:12:09 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:10 user nova-compute[71474]: DEBUG oslo_service.periodic_task 
[None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:12:10 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:12:14 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:19 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:12:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:12:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:12:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:12:24 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:29 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Acquiring lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71474) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG nova.compute.manager [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Starting instance... {{(pid=71474) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71474) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 21 14:12:31 user nova-compute[71474]: INFO nova.compute.claims [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Claim successful on node user Apr 21 14:12:31 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG 
nova.compute.manager [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Start building networks asynchronously for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG nova.compute.manager [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Allocating IP information in the background. {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG nova.network.neutron [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] allocate_for_instance() {{(pid=71474) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 21 14:12:31 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 21 14:12:31 user nova-compute[71474]: DEBUG nova.compute.manager [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Start building block device mappings for instance. {{(pid=71474) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG nova.policy [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1eadfd22cf841d8924dadda7cc29f6a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c8eefb268d74734820c0faac7ddb131', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71474) authorize /opt/stack/nova/nova/policy.py:203}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG nova.compute.manager [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Start spawning the instance on the hypervisor. 
{{(pid=71474) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Creating instance directory {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 21 14:12:31 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Creating image(s) Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Acquiring lock "/opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "/opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "/opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.006s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.126s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 
tempest-VolumesActionsTest-1664293238-project-member] Acquiring lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:12:31 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.125s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk 1073741824 {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7,backing_fmt=raw /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk 1073741824" returned: 0 in 0.044s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "8e8c288cb98f22f6af31ad55f38b7baa81c260d7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.175s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG 
oslo_concurrency.processutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7 --force-share --output=json" returned: 0 in 0.156s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Checking if we can resize image /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk. size=1073741824 {{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG nova.network.neutron [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Successfully created port: 39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG nova.virt.disk.api [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Cannot resize image /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk to a smaller size. 
{{(pid=71474) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG nova.objects.instance [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lazy-loading 'migration_context' on Instance uuid ff432b87-a8d9-4dfd-9abc-29f93ea545d0 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Created local disks {{(pid=71474) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Ensure instance console log exists: /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/console.log {{(pid=71474) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:12:32 user nova-compute[71474]: DEBUG nova.network.neutron [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Successfully updated port: 39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Acquiring lock "refresh_cache-ff432b87-a8d9-4dfd-9abc-29f93ea545d0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Acquired lock 
"refresh_cache-ff432b87-a8d9-4dfd-9abc-29f93ea545d0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.network.neutron [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Building network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.compute.manager [req-9fcda110-6e35-4d10-b945-103cf8f7b179 req-aa87dd10-928f-4332-b820-b37a20c9930c service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Received event network-changed-39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.compute.manager [req-9fcda110-6e35-4d10-b945-103cf8f7b179 req-aa87dd10-928f-4332-b820-b37a20c9930c service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Refreshing instance network info cache due to event network-changed-39a38a51-576d-4bf1-a4c1-013343ef291c. {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9fcda110-6e35-4d10-b945-103cf8f7b179 req-aa87dd10-928f-4332-b820-b37a20c9930c service nova] Acquiring lock "refresh_cache-ff432b87-a8d9-4dfd-9abc-29f93ea545d0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.network.neutron [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Instance cache missing network info. 
{{(pid=71474) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.network.neutron [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Updating instance_info_cache with network_info: [{"id": "39a38a51-576d-4bf1-a4c1-013343ef291c", "address": "fa:16:3e:3e:47:7e", "network": {"id": "42ccd7c8-88c9-488d-930f-e97bbf32973d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1599304130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "5c8eefb268d74734820c0faac7ddb131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap39a38a51-57", "ovs_interfaceid": "39a38a51-576d-4bf1-a4c1-013343ef291c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Releasing lock "refresh_cache-ff432b87-a8d9-4dfd-9abc-29f93ea545d0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.compute.manager [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Instance network_info: |[{"id": "39a38a51-576d-4bf1-a4c1-013343ef291c", "address": "fa:16:3e:3e:47:7e", "network": {"id": "42ccd7c8-88c9-488d-930f-e97bbf32973d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1599304130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "5c8eefb268d74734820c0faac7ddb131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap39a38a51-57", "ovs_interfaceid": "39a38a51-576d-4bf1-a4c1-013343ef291c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71474) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9fcda110-6e35-4d10-b945-103cf8f7b179 req-aa87dd10-928f-4332-b820-b37a20c9930c service nova] Acquired lock "refresh_cache-ff432b87-a8d9-4dfd-9abc-29f93ea545d0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG 
nova.network.neutron [req-9fcda110-6e35-4d10-b945-103cf8f7b179 req-aa87dd10-928f-4332-b820-b37a20c9930c service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Refreshing network info cache for port 39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Start _get_guest_xml network_info=[{"id": "39a38a51-576d-4bf1-a4c1-013343ef291c", "address": "fa:16:3e:3e:47:7e", "network": {"id": "42ccd7c8-88c9-488d-930f-e97bbf32973d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1599304130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "5c8eefb268d74734820c0faac7ddb131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap39a38a51-57", "ovs_interfaceid": "39a38a51-576d-4bf1-a4c1-013343ef291c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'guest_format': None, 'encryption_secret_uuid': None, 'encryption_options': None, 'boot_index': 0, 'image_id': '2edfef44-2867-4e03-a53e-b139f99afa75'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 21 14:12:33 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:12:33 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
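The 14:12:31-14:12:32 entries above build the root disk for instance ff432b87-a8d9-4dfd-9abc-29f93ea545d0 as a thin qcow2 overlay whose backing file is the cached base image under _base, then check whether the overlay needs to be grown and skip the resize because the target size is not larger. A rough equivalent of that create step, reusing the paths and 1 GiB size from the log; this is a sketch of the command nova issued, not the imagebackend code path itself:

    from oslo_concurrency import processutils

    base = '/opt/stack/data/nova/instances/_base/8e8c288cb98f22f6af31ad55f38b7baa81c260d7'
    disk = '/opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk'

    # qcow2 overlay: writes land in 'disk', unmodified blocks are read from 'base'.
    processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % base,
        disk, '1073741824')  # 1 GiB virtual size, matching the m1.nano root disk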
Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71474) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-21T13:55:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-21T13:54:16Z,direct_url=,disk_format='qcow2',id=2edfef44-2867-4e03-a53e-b139f99afa75,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='36a44032fda748c1965c722304fa176d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-21T13:54:18Z,virtual_size=,visibility=), allow threads: True {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Flavor limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Image limits 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Flavor pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Image pref 0:0:0 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71474) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG 
nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Got 1 possible topologies {{(pid=71474) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.hardware [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71474) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:12:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1372418588',display_name='tempest-VolumesActionsTest-instance-1372418588',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1372418588',id=25,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c8eefb268d74734820c0faac7ddb131',ramdisk_id='',reservation_id='r-v10fxrmf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1664293238',owner_user_name='tempest-VolumesActionsTest-1664293238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:12:32Z,user_data=None,user_id='a1eadfd22cf841d8924dadda7cc29f6a',uuid=ff4
32b87-a8d9-4dfd-9abc-29f93ea545d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "39a38a51-576d-4bf1-a4c1-013343ef291c", "address": "fa:16:3e:3e:47:7e", "network": {"id": "42ccd7c8-88c9-488d-930f-e97bbf32973d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1599304130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "5c8eefb268d74734820c0faac7ddb131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap39a38a51-57", "ovs_interfaceid": "39a38a51-576d-4bf1-a4c1-013343ef291c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71474) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Converting VIF {"id": "39a38a51-576d-4bf1-a4c1-013343ef291c", "address": "fa:16:3e:3e:47:7e", "network": {"id": "42ccd7c8-88c9-488d-930f-e97bbf32973d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1599304130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "5c8eefb268d74734820c0faac7ddb131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap39a38a51-57", "ovs_interfaceid": "39a38a51-576d-4bf1-a4c1-013343ef291c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:47:7e,bridge_name='br-int',has_traffic_filtering=True,id=39a38a51-576d-4bf1-a4c1-013343ef291c,network=Network(42ccd7c8-88c9-488d-930f-e97bbf32973d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39a38a51-57') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.objects.instance [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lazy-loading 'pci_devices' on Instance uuid ff432b87-a8d9-4dfd-9abc-29f93ea545d0 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f 
tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] End _get_guest_xml xml= [libvirt guest XML elided: the markup was stripped during extraction, leaving only per-line journald prefixes and bare element values. Values still recoverable from the residue: uuid ff432b87-a8d9-4dfd-9abc-29f93ea545d0, name instance-00000019, memory 131072, vcpus 1, nova metadata (instance name tempest-VolumesActionsTest-instance-1372418588, creation time 2023-04-21 14:12:33, flavor values 128/1/0/0/1 consistent with m1.nano, owner tempest-VolumesActionsTest-1664293238-project-member in project tempest-VolumesActionsTest-1664293238), sysinfo OpenStack Foundation / OpenStack Nova 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, RNG backend /dev/urandom.] {{(pid=71474) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:12:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1372418588',display_name='tempest-VolumesActionsTest-instance-1372418588',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1372418588',id=25,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c8eefb268d74734820c0faac7ddb131',ramdisk_id='',reservation_id='r-v10fxrmf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1664293238',owner_user_name='tempest-VolumesActionsTest-1664293238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-21T14:12:32Z,user_data=None,user_id='a1eadfd22cf841d8924dadda7cc29f6a',uuid=ff432b87-a8d9-4dfd-9abc-29f93ea545d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "39a38a51-576d-4bf1-a4c1-013343ef291c", "address": "fa:16:3e:3e:47:7e", "network": {"id": "42ccd7c8-88c9-488d-930f-e97bbf32973d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1599304130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "5c8eefb268d74734820c0faac7ddb131", "mtu": 1442, "physical_network":
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap39a38a51-57", "ovs_interfaceid": "39a38a51-576d-4bf1-a4c1-013343ef291c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Converting VIF {"id": "39a38a51-576d-4bf1-a4c1-013343ef291c", "address": "fa:16:3e:3e:47:7e", "network": {"id": "42ccd7c8-88c9-488d-930f-e97bbf32973d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1599304130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "5c8eefb268d74734820c0faac7ddb131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap39a38a51-57", "ovs_interfaceid": "39a38a51-576d-4bf1-a4c1-013343ef291c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:47:7e,bridge_name='br-int',has_traffic_filtering=True,id=39a38a51-576d-4bf1-a4c1-013343ef291c,network=Network(42ccd7c8-88c9-488d-930f-e97bbf32973d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39a38a51-57') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG os_vif [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:47:7e,bridge_name='br-int',has_traffic_filtering=True,id=39a38a51-576d-4bf1-a4c1-013343ef291c,network=Network(42ccd7c8-88c9-488d-930f-e97bbf32973d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39a38a51-57') {{(pid=71474) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71474) 
do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39a38a51-57, may_exist=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap39a38a51-57, col_values=(('external_ids', {'iface-id': '39a38a51-576d-4bf1-a4c1-013343ef291c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:47:7e', 'vm-uuid': 'ff432b87-a8d9-4dfd-9abc-29f93ea545d0'}),)) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:33 user nova-compute[71474]: INFO os_vif [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:47:7e,bridge_name='br-int',has_traffic_filtering=True,id=39a38a51-576d-4bf1-a4c1-013343ef291c,network=Network(42ccd7c8-88c9-488d-930f-e97bbf32973d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39a38a51-57') Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] No BDM found with device name vda, not building metadata. {{(pid=71474) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 21 14:12:33 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] No VIF found with MAC fa:16:3e:3e:47:7e, not building metadata {{(pid=71474) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 21 14:12:34 user nova-compute[71474]: DEBUG nova.network.neutron [req-9fcda110-6e35-4d10-b945-103cf8f7b179 req-aa87dd10-928f-4332-b820-b37a20c9930c service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Updated VIF entry in instance network info cache for port 39a38a51-576d-4bf1-a4c1-013343ef291c. 
{{(pid=71474) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 21 14:12:34 user nova-compute[71474]: DEBUG nova.network.neutron [req-9fcda110-6e35-4d10-b945-103cf8f7b179 req-aa87dd10-928f-4332-b820-b37a20c9930c service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Updating instance_info_cache with network_info: [{"id": "39a38a51-576d-4bf1-a4c1-013343ef291c", "address": "fa:16:3e:3e:47:7e", "network": {"id": "42ccd7c8-88c9-488d-930f-e97bbf32973d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1599304130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "5c8eefb268d74734820c0faac7ddb131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap39a38a51-57", "ovs_interfaceid": "39a38a51-576d-4bf1-a4c1-013343ef291c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:12:34 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-9fcda110-6e35-4d10-b945-103cf8f7b179 req-aa87dd10-928f-4332-b820-b37a20c9930c service nova] Releasing lock "refresh_cache-ff432b87-a8d9-4dfd-9abc-29f93ea545d0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:12:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:35 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:35 user nova-compute[71474]: DEBUG nova.compute.manager [req-bcff33ec-dc9d-45b9-abad-3a8d1b5c5663 req-241347af-0aa4-4546-925e-623571df598c service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Received event network-vif-plugged-39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:12:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-bcff33ec-dc9d-45b9-abad-3a8d1b5c5663 req-241347af-0aa4-4546-925e-623571df598c service nova] Acquiring lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:12:35 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-bcff33ec-dc9d-45b9-abad-3a8d1b5c5663 req-241347af-0aa4-4546-925e-623571df598c service nova] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:12:35 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [req-bcff33ec-dc9d-45b9-abad-3a8d1b5c5663 req-241347af-0aa4-4546-925e-623571df598c service nova] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:12:35 user nova-compute[71474]: DEBUG nova.compute.manager [req-bcff33ec-dc9d-45b9-abad-3a8d1b5c5663 req-241347af-0aa4-4546-925e-623571df598c service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] No waiting events found dispatching network-vif-plugged-39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:12:35 user nova-compute[71474]: WARNING nova.compute.manager [req-bcff33ec-dc9d-45b9-abad-3a8d1b5c5663 req-241347af-0aa4-4546-925e-623571df598c service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Received unexpected event network-vif-plugged-39a38a51-576d-4bf1-a4c1-013343ef291c for instance with vm_state building and task_state spawning. Apr 21 14:12:36 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Resumed> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:12:37 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] VM Resumed (Lifecycle Event) Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.compute.manager [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Instance event wait completed in 0 seconds for {{(pid=71474) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Guest created on hypervisor {{(pid=71474) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 21 14:12:37 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Instance spawned successfully. 
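The VIF plug above amounts to three ovsdbapp commands (AddBridgeCommand, AddPortCommand, DbSetCommand) committed against the ovsdb-server at tcp:127.0.0.1:6640. As a rough illustration only, not the os-vif plugin's actual code, the same operations could be issued directly with the ovsdbapp library (here combined into a single transaction for brevity), assuming that local endpoint is reachable:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Connect to the same local ovsdb-server the log shows (tcp:127.0.0.1:6640).
    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    # One transaction mirroring the three logged commands: create the
    # integration bridge if needed, add the tap port, then tag the Interface
    # row with the Neutron port id, MAC and instance uuid via external_ids.
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tap39a38a51-57', may_exist=True))
        txn.add(api.db_set('Interface', 'tap39a38a51-57',
                           ('external_ids', {
                               'iface-id': '39a38a51-576d-4bf1-a4c1-013343ef291c',
                               'iface-status': 'active',
                               'attached-mac': 'fa:16:3e:3e:47:7e',
                               'vm-uuid': 'ff432b87-a8d9-4dfd-9abc-29f93ea545d0'})))

The "Transaction caused no change" line earlier indicates br-int already existed, so the AddBridgeCommand with may_exist=True was a no-op.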
Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Found default for hw_cdrom_bus of ide {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Found default for hw_disk_bus of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Found default for hw_input_bus of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Found default for hw_pointer_model of None {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Found default for hw_video_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.virt.libvirt.driver [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Found default for hw_vif_model of virtio {{(pid=71474) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM 
power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:12:37 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.virt.driver [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] Emitting event Started> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:12:37 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] VM Started (Lifecycle Event) Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71474) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 21 14:12:37 user nova-compute[71474]: INFO nova.compute.manager [None req-9f75c7c9-87e4-4bd0-ac0c-ff6eada78b82 None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] During sync_power_state the instance has a pending task (spawning). Skip. Apr 21 14:12:37 user nova-compute[71474]: INFO nova.compute.manager [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Took 5.51 seconds to spawn the instance on the hypervisor. 
Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.compute.manager [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.compute.manager [req-955c43e1-83e2-422d-ae19-60f84ddfe59a req-f886bede-3951-4266-9201-725e1b9c3697 service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Received event network-vif-plugged-39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-955c43e1-83e2-422d-ae19-60f84ddfe59a req-f886bede-3951-4266-9201-725e1b9c3697 service nova] Acquiring lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-955c43e1-83e2-422d-ae19-60f84ddfe59a req-f886bede-3951-4266-9201-725e1b9c3697 service nova] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-955c43e1-83e2-422d-ae19-60f84ddfe59a req-f886bede-3951-4266-9201-725e1b9c3697 service nova] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:12:37 user nova-compute[71474]: DEBUG nova.compute.manager [req-955c43e1-83e2-422d-ae19-60f84ddfe59a req-f886bede-3951-4266-9201-725e1b9c3697 service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] No waiting events found dispatching network-vif-plugged-39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:12:37 user nova-compute[71474]: WARNING nova.compute.manager [req-955c43e1-83e2-422d-ae19-60f84ddfe59a req-f886bede-3951-4266-9201-725e1b9c3697 service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Received unexpected event network-vif-plugged-39a38a51-576d-4bf1-a4c1-013343ef291c for instance with vm_state building and task_state spawning. Apr 21 14:12:37 user nova-compute[71474]: INFO nova.compute.manager [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Took 6.04 seconds to build instance. 
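The Acquiring/acquired/released bookkeeping around pop_instance_event above, with the waited/held timings, comes from oslo.concurrency's lockutils helpers. A minimal sketch of the two usage patterns that emit those DEBUG lines, with lock names copied from this log purely for illustration:

    from oslo_concurrency import lockutils

    # Decorator form: serializes callers on a named in-process lock and logs
    # the "acquired by ... :: waited" / "released by ... :: held" pairs.
    @lockutils.synchronized('ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events')
    def pop_event(events, name):
        return events.pop(name, None)

    # Context-manager form, which logs the plain Acquiring/Acquired/Releasing
    # lines seen for the refresh_cache-<uuid> and singleton_lock locks.
    def refresh_cache(instance_uuid):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # refresh the network info cache while holding the lock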
Apr 21 14:12:37 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-91c11184-d9ce-4614-949a-88a288ff4a7f tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.131s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:12:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:43 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:12:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:12:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:12:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:12:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:56 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:12:58 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:13:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:13:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic 
task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:13:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:13:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:13:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:13:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:13:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:05 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:13:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:13:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Rebuilding the list of instances to heal {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 14:13:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:13:05 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:13:05 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:13:05 user nova-compute[71474]: DEBUG nova.objects.instance [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lazy-loading 'info_cache' on Instance uuid b5e2e065-1b7d-4cbf-b31a-923ae2f92fff {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:13:06 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:06 user 
nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updating instance_info_cache with network_info: [{"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.101", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:13:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:13:06 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:13:06 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:13:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:13:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:13:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:13:06 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) 
update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:13:06 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:13:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:13:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:13:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:13:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:13:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:13:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:13:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:13:07 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:13:07 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:13:07 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=8970MB free_disk=26.08829116821289GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": 
"0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:13:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:13:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:13:08 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance b5e2e065-1b7d-4cbf-b31a-923ae2f92fff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:13:08 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance ff432b87-a8d9-4dfd-9abc-29f93ea545d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:13:08 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:13:08 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:13:08 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:13:08 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:13:08 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:13:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.223s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:13:08 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:09 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:13:09 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:13:09 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:13:10 user nova-compute[71474]: DEBUG oslo_service.periodic_task 
[None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:13:10 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:13:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:13:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:13:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:13:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:13:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:21 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:28 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:13:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:13:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Acquiring lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" acquired by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Acquiring lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:13:39 user nova-compute[71474]: INFO nova.compute.manager [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Terminating instance Apr 21 14:13:39 user nova-compute[71474]: DEBUG nova.compute.manager [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Start destroying the instance on the hypervisor. 
{{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG nova.compute.manager [req-38799436-2dce-4605-aec0-6f42f10ab668 req-2a605863-7fc8-4d93-83bf-1626ab8baa84 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Received event network-vif-unplugged-7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-38799436-2dce-4605-aec0-6f42f10ab668 req-2a605863-7fc8-4d93-83bf-1626ab8baa84 service nova] Acquiring lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-38799436-2dce-4605-aec0-6f42f10ab668 req-2a605863-7fc8-4d93-83bf-1626ab8baa84 service nova] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-38799436-2dce-4605-aec0-6f42f10ab668 req-2a605863-7fc8-4d93-83bf-1626ab8baa84 service nova] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG nova.compute.manager [req-38799436-2dce-4605-aec0-6f42f10ab668 req-2a605863-7fc8-4d93-83bf-1626ab8baa84 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] No waiting events found dispatching network-vif-unplugged-7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:13:39 user nova-compute[71474]: DEBUG nova.compute.manager [req-38799436-2dce-4605-aec0-6f42f10ab668 req-2a605863-7fc8-4d93-83bf-1626ab8baa84 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Received event network-vif-unplugged-7eb11528-a882-4084-a2c7-b36fd432fecf for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:13:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:40 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Instance destroyed successfully. 
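During the update_available_resource audit earlier in this window, each instance disk was probed with qemu-img info wrapped in oslo_concurrency.prlimit (1 GiB address space, 30 s CPU time) and --force-share so a running guest holding the image open does not block the read. A rough, hypothetical equivalent of that invocation using oslo.concurrency directly (not nova's own helper) would be:

    import json
    from oslo_concurrency import processutils

    def qemu_img_info(path):
        # Same shape as the logged command: env LC_ALL=C LANG=C qemu-img info
        # <disk> --force-share --output=json, run under prlimit --as/--cpu.
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=processutils.ProcessLimits(address_space=1073741824,
                                               cpu_time=30))
        return json.loads(out)

    info = qemu_img_info('/opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk')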
Apr 21 14:13:40 user nova-compute[71474]: DEBUG nova.objects.instance [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lazy-loading 'resources' on Instance uuid b5e2e065-1b7d-4cbf-b31a-923ae2f92fff {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:13:40 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:04:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-564594493',display_name='tempest-ServerActionsTestJSON-server-564594493',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-564594493',id=20,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOBtS2vU2hjTdBp9+5GXmMtOFB68EC/JBp4srwATzJ0qLZ+BQocGkSEAP1z1S9M9P1kEv0Vd6quAa1O8JdG4KfvkaPJsmlaSpX/6CyeVnURB0GwGeWV66UBeLbeyknARPA==',key_name='tempest-keypair-2085847639',keypairs=,launch_index=0,launched_at=2023-04-21T14:04:54Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='41c39fcb224f4e69a73734be43ba6588',ramdisk_id='',reservation_id='r-62aqkdvj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerActionsTestJSON-2051074452',owner_user_name='tempest-ServerActionsTestJSON-2051074452-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:04:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='95f0f10528294c9bb3d4f58f3361c358',uuid=b5e2e065-1b7d-4cbf-b31a-923ae2f92fff,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "172.24.4.101", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:13:40 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Converting VIF {"id": "7eb11528-a882-4084-a2c7-b36fd432fecf", "address": "fa:16:3e:05:9e:ea", "network": {"id": "d9138a89-3d80-4ef8-b937-1613f614c9e8", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2095900346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.101", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c39fcb224f4e69a73734be43ba6588", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7eb11528-a8", "ovs_interfaceid": "7eb11528-a882-4084-a2c7-b36fd432fecf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:13:40 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:9e:ea,bridge_name='br-int',has_traffic_filtering=True,id=7eb11528-a882-4084-a2c7-b36fd432fecf,network=Network(d9138a89-3d80-4ef8-b937-1613f614c9e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eb11528-a8') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:13:40 user nova-compute[71474]: DEBUG os_vif [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:9e:ea,bridge_name='br-int',has_traffic_filtering=True,id=7eb11528-a882-4084-a2c7-b36fd432fecf,network=Network(d9138a89-3d80-4ef8-b937-1613f614c9e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eb11528-a8') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:13:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:40 user 
nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7eb11528-a8, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:13:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:40 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:13:40 user nova-compute[71474]: INFO os_vif [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:9e:ea,bridge_name='br-int',has_traffic_filtering=True,id=7eb11528-a882-4084-a2c7-b36fd432fecf,network=Network(d9138a89-3d80-4ef8-b937-1613f614c9e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7eb11528-a8') Apr 21 14:13:40 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Deleting instance files /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff_del Apr 21 14:13:40 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Deletion of /opt/stack/data/nova/instances/b5e2e065-1b7d-4cbf-b31a-923ae2f92fff_del complete Apr 21 14:13:40 user nova-compute[71474]: INFO nova.compute.manager [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 21 14:13:40 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:13:40 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:13:40 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:13:41 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Took 0.96 seconds to deallocate network for instance. Apr 21 14:13:41 user nova-compute[71474]: DEBUG nova.compute.manager [req-14566ffa-e1ba-4d97-96ff-5cb345d2c40e req-712349d5-dab3-4479-948d-a18beef0fefd service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Received event network-vif-deleted-7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:13:41 user nova-compute[71474]: INFO nova.compute.manager [req-14566ffa-e1ba-4d97-96ff-5cb345d2c40e req-712349d5-dab3-4479-948d-a18beef0fefd service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Neutron deleted interface 7eb11528-a882-4084-a2c7-b36fd432fecf; detaching it from the instance and deleting it from the info cache Apr 21 14:13:41 user nova-compute[71474]: DEBUG nova.network.neutron [req-14566ffa-e1ba-4d97-96ff-5cb345d2c40e req-712349d5-dab3-4479-948d-a18beef0fefd service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG nova.compute.manager [req-14566ffa-e1ba-4d97-96ff-5cb345d2c40e req-712349d5-dab3-4479-948d-a18beef0fefd service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Detach interface failed, port_id=7eb11528-a882-4084-a2c7-b36fd432fecf, reason: Instance b5e2e065-1b7d-4cbf-b31a-923ae2f92fff could not be found. 
{{(pid=71474) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:13:41 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Deleted allocations for instance b5e2e065-1b7d-4cbf-b31a-923ae2f92fff Apr 21 14:13:41 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-56d9b06c-d4f4-4483-b28f-4377f855e728 tempest-ServerActionsTestJSON-2051074452 tempest-ServerActionsTestJSON-2051074452-project-member] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.119s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG nova.compute.manager [req-7dc6500a-5c66-4c2e-8222-6c5006420673 req-3e37fdc9-086b-4780-8a0d-6b65485058a1 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Received event 
network-vif-plugged-7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7dc6500a-5c66-4c2e-8222-6c5006420673 req-3e37fdc9-086b-4780-8a0d-6b65485058a1 service nova] Acquiring lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7dc6500a-5c66-4c2e-8222-6c5006420673 req-3e37fdc9-086b-4780-8a0d-6b65485058a1 service nova] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-7dc6500a-5c66-4c2e-8222-6c5006420673 req-3e37fdc9-086b-4780-8a0d-6b65485058a1 service nova] Lock "b5e2e065-1b7d-4cbf-b31a-923ae2f92fff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:13:41 user nova-compute[71474]: DEBUG nova.compute.manager [req-7dc6500a-5c66-4c2e-8222-6c5006420673 req-3e37fdc9-086b-4780-8a0d-6b65485058a1 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] No waiting events found dispatching network-vif-plugged-7eb11528-a882-4084-a2c7-b36fd432fecf {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:13:41 user nova-compute[71474]: WARNING nova.compute.manager [req-7dc6500a-5c66-4c2e-8222-6c5006420673 req-3e37fdc9-086b-4780-8a0d-6b65485058a1 service nova] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Received unexpected event network-vif-plugged-7eb11528-a882-4084-a2c7-b36fd432fecf for instance with vm_state deleted and task_state None. 
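The VIF teardown above ends in a single OVSDB transaction ("Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7eb11528-a8, bridge=br-int, if_exists=True)"). A hedged sketch of the equivalent ovsdbapp call, assuming a reachable OVSDB server at tcp:127.0.0.1:6640 as shown elsewhere in the log; the port and bridge names are copied from the log for illustration only:

```python
# Sketch of the ovsdbapp pattern behind the DelPortCommand transaction logged
# above (what os-vif effectively issues when unplugging the tap from br-int).
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

OVSDB_CONNECTION = 'tcp:127.0.0.1:6640'  # endpoint shown in the log; assumed reachable

idl = connection.OvsdbIdl.from_server(OVSDB_CONNECTION, 'Open_vSwitch')
conn = connection.Connection(idl=idl, timeout=5)
ovs = impl_idl.OvsdbIdl(conn)

# One transaction containing a single DelPortCommand.
ovs.del_port('tap7eb11528-a8', bridge='br-int', if_exists=True).execute(check_error=True)
```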
Apr 21 14:13:45 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:50 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:13:51 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:13:55 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:13:55 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] VM Stopped (Lifecycle Event) Apr 21 14:13:55 user nova-compute[71474]: DEBUG nova.compute.manager [None req-65db96f7-297a-41ce-9a12-66d919a5f05e None None] [instance: b5e2e065-1b7d-4cbf-b31a-923ae2f92fff] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:13:55 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:00 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:01 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:03 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:14:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:14:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:14:05 user 
nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:14:05 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:05 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:05 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Cleaning up deleted instances with incomplete migration {{(pid=71474) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 21 14:14:06 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:06 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:14:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "refresh_cache-ff432b87-a8d9-4dfd-9abc-29f93ea545d0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 21 14:14:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquired lock "refresh_cache-ff432b87-a8d9-4dfd-9abc-29f93ea545d0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 21 14:14:06 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Forcefully refreshing network info cache for instance {{(pid=71474) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG nova.network.neutron [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Updating instance_info_cache with network_info: [{"id": "39a38a51-576d-4bf1-a4c1-013343ef291c", "address": "fa:16:3e:3e:47:7e", "network": {"id": "42ccd7c8-88c9-488d-930f-e97bbf32973d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1599304130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "5c8eefb268d74734820c0faac7ddb131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap39a38a51-57", "ovs_interfaceid": "39a38a51-576d-4bf1-a4c1-013343ef291c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Releasing lock "refresh_cache-ff432b87-a8d9-4dfd-9abc-29f93ea545d0" {{(pid=71474) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Updated the network info_cache for instance {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running cmd (subprocess): /usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk --force-share --output=json {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 21 14:14:07 user nova-compute[71474]: DEBUG oslo_concurrency.processutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71474) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 21 14:14:08 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:14:08 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=9046MB free_disk=26.107166290283203GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": 
"0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Instance ff432b87-a8d9-4dfd-9abc-29f93ea545d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71474) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Refreshing inventories for resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Updating ProviderTree inventory for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Updating inventory in ProviderTree for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Refreshing aggregate associations for resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7, aggregates: None {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Refreshing trait associations for resource provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7, traits: 
COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_BOCHS {{(pid=71474) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.479s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Cleaning up deleted instances {{(pid=71474) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 21 14:14:08 user nova-compute[71474]: DEBUG nova.compute.manager [None 
req-9735158e-337c-4f69-906b-f91d38c505b5 None None] There are 0 instances to clean {{(pid=71474) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 21 14:14:09 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:09 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:09 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:14:10 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:12 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:14:12 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:14:15 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:20 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Acquiring lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:14:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:14:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Acquiring lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:14:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:14:22 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:14:22 user nova-compute[71474]: INFO nova.compute.manager [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Terminating instance Apr 21 14:14:22 user nova-compute[71474]: DEBUG nova.compute.manager [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Start destroying the instance on the hypervisor. 
{{(pid=71474) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 21 14:14:22 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG nova.compute.manager [req-494044eb-39e1-4477-a322-d3689c2b5573 req-657c1f14-c87e-4b4b-8a05-95497986bc98 service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Received event network-vif-unplugged-39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-494044eb-39e1-4477-a322-d3689c2b5573 req-657c1f14-c87e-4b4b-8a05-95497986bc98 service nova] Acquiring lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-494044eb-39e1-4477-a322-d3689c2b5573 req-657c1f14-c87e-4b4b-8a05-95497986bc98 service nova] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-494044eb-39e1-4477-a322-d3689c2b5573 req-657c1f14-c87e-4b4b-8a05-95497986bc98 service nova] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG nova.compute.manager [req-494044eb-39e1-4477-a322-d3689c2b5573 req-657c1f14-c87e-4b4b-8a05-95497986bc98 service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] No waiting events found dispatching network-vif-unplugged-39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG nova.compute.manager [req-494044eb-39e1-4477-a322-d3689c2b5573 req-657c1f14-c87e-4b4b-8a05-95497986bc98 service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Received event network-vif-unplugged-39a38a51-576d-4bf1-a4c1-013343ef291c for instance with task_state deleting. {{(pid=71474) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 21 14:14:23 user nova-compute[71474]: INFO nova.virt.libvirt.driver [-] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Instance destroyed successfully. 
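During the update_available_resource audit earlier in this section, the disk of instance ff432b87-a8d9-4dfd-9abc-29f93ea545d0 is inspected with "qemu-img info" wrapped in oslo_concurrency.prlimit (--as=1073741824 --cpu=30). A sketch of how such a resource-capped command can be issued with oslo.concurrency; the disk path below is a placeholder, not the path from the log:

```python
# Passing a ProcessLimits object makes execute() wrap the command in
# "python3 -m oslo_concurrency.prlimit --as=... --cpu=...", exactly the shape
# of the CMD lines logged during the audit.
from oslo_concurrency import processutils

limits = processutils.ProcessLimits(address_space=1024 * 1024 * 1024,  # --as=1073741824
                                     cpu_time=30)                       # --cpu=30

out, _err = processutils.execute(
    'env', 'LC_ALL=C', 'LANG=C',
    'qemu-img', 'info', '/path/to/instance/disk',  # placeholder path
    '--force-share', '--output=json',
    prlimit=limits)
print(out)
```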
Apr 21 14:14:23 user nova-compute[71474]: DEBUG nova.objects.instance [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lazy-loading 'resources' on Instance uuid ff432b87-a8d9-4dfd-9abc-29f93ea545d0 {{(pid=71474) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG nova.virt.libvirt.vif [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-21T14:12:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1372418588',display_name='tempest-VolumesActionsTest-instance-1372418588',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1372418588',id=25,image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-21T14:12:37Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='5c8eefb268d74734820c0faac7ddb131',ramdisk_id='',reservation_id='r-v10fxrmf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2edfef44-2867-4e03-a53e-b139f99afa75',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesActionsTest-1664293238',owner_user_name='tempest-VolumesActionsTest-1664293238-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-21T14:12:37Z,user_data=None,user_id='a1eadfd22cf841d8924dadda7cc29f6a',uuid=ff432b87-a8d9-4dfd-9abc-29f93ea545d0,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "39a38a51-576d-4bf1-a4c1-013343ef291c", "address": "fa:16:3e:3e:47:7e", "network": {"id": "42ccd7c8-88c9-488d-930f-e97bbf32973d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1599304130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "5c8eefb268d74734820c0faac7ddb131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap39a38a51-57", "ovs_interfaceid": 
"39a38a51-576d-4bf1-a4c1-013343ef291c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Converting VIF {"id": "39a38a51-576d-4bf1-a4c1-013343ef291c", "address": "fa:16:3e:3e:47:7e", "network": {"id": "42ccd7c8-88c9-488d-930f-e97bbf32973d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1599304130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "5c8eefb268d74734820c0faac7ddb131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap39a38a51-57", "ovs_interfaceid": "39a38a51-576d-4bf1-a4c1-013343ef291c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG nova.network.os_vif_util [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:47:7e,bridge_name='br-int',has_traffic_filtering=True,id=39a38a51-576d-4bf1-a4c1-013343ef291c,network=Network(42ccd7c8-88c9-488d-930f-e97bbf32973d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39a38a51-57') {{(pid=71474) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG os_vif [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:47:7e,bridge_name='br-int',has_traffic_filtering=True,id=39a38a51-576d-4bf1-a4c1-013343ef291c,network=Network(42ccd7c8-88c9-488d-930f-e97bbf32973d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39a38a51-57') {{(pid=71474) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39a38a51-57, bridge=br-int, if_exists=True) {{(pid=71474) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71474) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:14:23 user nova-compute[71474]: INFO os_vif [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:47:7e,bridge_name='br-int',has_traffic_filtering=True,id=39a38a51-576d-4bf1-a4c1-013343ef291c,network=Network(42ccd7c8-88c9-488d-930f-e97bbf32973d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39a38a51-57') Apr 21 14:14:23 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Deleting instance files /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0_del Apr 21 14:14:23 user nova-compute[71474]: INFO nova.virt.libvirt.driver [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Deletion of /opt/stack/data/nova/instances/ff432b87-a8d9-4dfd-9abc-29f93ea545d0_del complete Apr 21 14:14:23 user nova-compute[71474]: INFO nova.compute.manager [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Took 0.68 seconds to destroy the instance on the hypervisor. Apr 21 14:14:23 user nova-compute[71474]: DEBUG oslo.service.loopingcall [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71474) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG nova.compute.manager [-] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Deallocating network for instance {{(pid=71474) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 21 14:14:23 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] deallocate_for_instance() {{(pid=71474) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 21 14:14:24 user nova-compute[71474]: DEBUG nova.network.neutron [-] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Updating instance_info_cache with network_info: [] {{(pid=71474) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 21 14:14:24 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Took 0.42 seconds to deallocate network for instance. 
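
The unplug path above converts Nova's VIF dict into an os-vif VIFOpenVSwitch object and then removes tap39a38a51-57 from br-int with a DelPortCommand. A rough sketch of driving the same 'ovs' plugin directly through the os-vif API, reusing identifiers from the log; this is illustrative only, not the Nova call path, and the exact set of fields a plugin needs may differ:

    import os_vif
    from os_vif.objects import instance_info, network, vif

    os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins

    inst = instance_info.InstanceInfo(
        uuid="ff432b87-a8d9-4dfd-9abc-29f93ea545d0",
        name="tempest-VolumesActionsTest-instance-1372418588")

    net = network.Network(
        id="42ccd7c8-88c9-488d-930f-e97bbf32973d", bridge="br-int")

    port = vif.VIFOpenVSwitch(
        id="39a38a51-576d-4bf1-a4c1-013343ef291c",
        address="fa:16:3e:3e:47:7e",
        vif_name="tap39a38a51-57",
        bridge_name="br-int",
        network=net,
        plugin="ovs")

    # Ask the 'ovs' plugin to tear the port down; under the hood this ends up
    # as the DelPortCommand(port=tap39a38a51-57, bridge=br-int, if_exists=True)
    # transaction logged above.
    os_vif.unplug(port, inst)
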
Apr 21 14:14:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:14:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:14:24 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:14:24 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:14:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:14:24 user nova-compute[71474]: INFO nova.scheduler.client.report [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Deleted allocations for instance ff432b87-a8d9-4dfd-9abc-29f93ea545d0 Apr 21 14:14:24 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-e9d15536-32ce-442d-b780-dd8ad6dca655 tempest-VolumesActionsTest-1664293238 tempest-VolumesActionsTest-1664293238-project-member] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.412s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:14:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-48da5423-e9fc-4ec5-afd0-1d7bac5aa8fa req-3a70199c-6c4b-4785-a4d4-1f9baebf8436 service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Received event network-vif-plugged-39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:14:25 user 
nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-48da5423-e9fc-4ec5-afd0-1d7bac5aa8fa req-3a70199c-6c4b-4785-a4d4-1f9baebf8436 service nova] Acquiring lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:14:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-48da5423-e9fc-4ec5-afd0-1d7bac5aa8fa req-3a70199c-6c4b-4785-a4d4-1f9baebf8436 service nova] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:14:25 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [req-48da5423-e9fc-4ec5-afd0-1d7bac5aa8fa req-3a70199c-6c4b-4785-a4d4-1f9baebf8436 service nova] Lock "ff432b87-a8d9-4dfd-9abc-29f93ea545d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:14:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-48da5423-e9fc-4ec5-afd0-1d7bac5aa8fa req-3a70199c-6c4b-4785-a4d4-1f9baebf8436 service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] No waiting events found dispatching network-vif-plugged-39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 21 14:14:25 user nova-compute[71474]: WARNING nova.compute.manager [req-48da5423-e9fc-4ec5-afd0-1d7bac5aa8fa req-3a70199c-6c4b-4785-a4d4-1f9baebf8436 service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Received unexpected event network-vif-plugged-39a38a51-576d-4bf1-a4c1-013343ef291c for instance with vm_state deleted and task_state None. 
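
"Deleted allocations for instance ..." above is Nova's report client clearing the instance's consumer record in the Placement service, against the same provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 whose inventory is re-checked just before (capacity per resource class is (total - reserved) * allocation_ratio, e.g. (12 - 0) * 4.0 = 48 schedulable VCPU). A rough sketch of inspecting and deleting such allocations over the Placement REST API with keystoneauth1; the credentials, endpoint and microversion here are illustrative assumptions, not values from this deployment:

    from keystoneauth1 import adapter, loading, session

    # Illustrative credentials; a real client reads these from configuration.
    auth = loading.get_plugin_loader("password").load_from_options(
        auth_url="http://controller/identity",
        username="admin", password="secret", project_name="admin",
        user_domain_id="default", project_domain_id="default")
    placement = adapter.Adapter(
        session=session.Session(auth=auth),
        service_type="placement", interface="public",
        default_microversion="1.28")

    consumer = "ff432b87-a8d9-4dfd-9abc-29f93ea545d0"
    # GET /allocations/{consumer_uuid}: what the instance still holds
    # (VCPU, MEMORY_MB, DISK_GB) against each resource provider.
    print(placement.get("/allocations/%s" % consumer).json())
    # DELETE /allocations/{consumer_uuid}: the operation behind
    # "Deleted allocations for instance ...".
    placement.delete("/allocations/%s" % consumer)
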
Apr 21 14:14:25 user nova-compute[71474]: DEBUG nova.compute.manager [req-48da5423-e9fc-4ec5-afd0-1d7bac5aa8fa req-3a70199c-6c4b-4785-a4d4-1f9baebf8436 service nova] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Received event network-vif-deleted-39a38a51-576d-4bf1-a4c1-013343ef291c {{(pid=71474) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 21 14:14:28 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:32 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:32 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:38 user nova-compute[71474]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71474) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 21 14:14:38 user nova-compute[71474]: INFO nova.compute.manager [-] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] VM Stopped (Lifecycle Event) Apr 21 14:14:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:14:38 user nova-compute[71474]: DEBUG nova.compute.manager [None req-d7b7260e-7c0b-4a6a-bf5c-9c0efd3a363e None None] [instance: ff432b87-a8d9-4dfd-9abc-29f93ea545d0] Checking state {{(pid=71474) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 21 14:14:43 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:14:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:14:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:14:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:14:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:14:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:52 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:53 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:54 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:56 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:58 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:14:59 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:01 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:02 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:15:03 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:15:04 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:15:06 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:06 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:15:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:15:06 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:15:06 user nova-compute[71474]: DEBUG 
oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:15:06 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Auditing locally available compute resources for user (node: user) {{(pid=71474) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 21 14:15:07 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:15:07 user nova-compute[71474]: WARNING nova.virt.libvirt.driver [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 21 14:15:07 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Hypervisor/Node resource view: name=user free_ram=9161MB free_disk=26.122299194335938GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": 
"0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71474) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 21 14:15:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 21 14:15:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 21 14:15:07 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 21 14:15:07 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71474) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 21 14:15:07 user nova-compute[71474]: DEBUG nova.compute.provider_tree [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed in ProviderTree for provider: 4e62c1ab-67bb-43ed-8389-61deb50e98d7 {{(pid=71474) 
update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 21 14:15:07 user nova-compute[71474]: DEBUG nova.scheduler.client.report [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Inventory has not changed for provider 4e62c1ab-67bb-43ed-8389-61deb50e98d7 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71474) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 21 14:15:07 user nova-compute[71474]: DEBUG nova.compute.resource_tracker [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Compute_service record updated for user:user {{(pid=71474) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 21 14:15:07 user nova-compute[71474]: DEBUG oslo_concurrency.lockutils [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s {{(pid=71474) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 21 14:15:08 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:15:08 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:08 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:15:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Starting heal instance info cache {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 21 14:15:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Rebuilding the list of instances to heal {{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 21 14:15:08 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Didn't find any instances for network info cache update. 
{{(pid=71474) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 21 14:15:09 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:15:09 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:15:10 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:10 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:15:11 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:13 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:13 user nova-compute[71474]: DEBUG oslo_service.periodic_task [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71474) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 21 14:15:13 user nova-compute[71474]: DEBUG nova.compute.manager [None req-9735158e-337c-4f69-906b-f91d38c505b5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71474) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 21 14:15:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:15:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:15:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:15:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:15:18 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:23 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:28 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:15:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:15:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:15:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:15:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:15:33 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:15:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:15:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE 
{{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:15:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:15:38 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:43 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 21 14:15:43 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:43 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71474) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 21 14:15:43 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:15:43 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71474) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 21 14:15:43 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 21 14:15:48 user nova-compute[71474]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71474) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
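
The trailing entries are the ovs/ovsdbapp client keeping its tcp:127.0.0.1:6640 session alive: an inactivity probe after ~5 s of idleness, IDLE/ACTIVE transitions, and POLLIN wakeups on fd 22. A small sketch of opening the same kind of OVSDB connection with ovsdbapp and confirming that the tap device deleted earlier is gone from br-int; the endpoint matches the log, the rest is an assumption:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Local OVSDB endpoint that the inactivity probes above are sent to.
    idl = connection.OvsdbIdl.from_server("tcp:127.0.0.1:6640", "Open_vSwitch")
    conn = connection.Connection(idl=idl, timeout=5)
    ovs = impl_idl.OvsdbIdl(conn)

    # ListPortsCommand on br-int; after the DelPortCommand earlier in the log,
    # tap39a38a51-57 should no longer be present.
    ports = ovs.list_ports("br-int").execute(check_error=True)
    print("tap39a38a51-57" in ports)
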